
US20210178063A1 - Controlling medication delivery system operation and features based on automatically detected user stress level - Google Patents


Info

Publication number
US20210178063A1
US20210178063A1
Authority
US
United States
Prior art keywords
stress
user
data
medication delivery
delivery system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/118,105
Inventor
Neha J. Parikh
Salman Monirabbasi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Medtronic Minimed Inc
Original Assignee
Medtronic Minimed Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Medtronic Minimed Inc filed Critical Medtronic Minimed Inc
Priority to US17/118,105
Assigned to MEDTRONIC MINIMED, INC. Assignors: MONIRABBASI, SALMAN; PARIKH, NEHA J.
Publication of US20210178063A1
Status: Abandoned

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 5/00 Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
    • A61M 5/14 Infusion devices, e.g. infusing by gravity; Blood infusion; Accessories therefor
    • A61M 5/168 Means for controlling media flow to the body or for metering media to the body, e.g. drip meters, counters; Monitoring media flow to the body
    • A61M 5/172 Means for controlling media flow to the body or for metering media to the body, electrical or electronic
    • A61M 5/1723 Means for controlling media flow to the body or for metering media to the body, electrical or electronic, using feedback of body parameters, e.g. blood-sugar, pressure
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B 5/0004 Remote monitoring of patients using telemetry, characterised by the type of physiological signal transmitted
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B 5/0015 Remote monitoring of patients using telemetry, characterised by features of the telemetry system
    • A61B 5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B 5/1112 Global tracking of patients, e.g. by using GPS
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/145 Measuring characteristics of blood in vivo, e.g. gas concentration or pH-value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid or cerebral tissue
    • A61B 5/14532 Measuring characteristics of blood in vivo, for measuring glucose, e.g. by tissue impedance measurement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, specially adapted to be attached to or worn on the body surface
    • A61B 5/6802 Sensor mounted on worn items
    • A61B 5/681 Wristwatch-type devices
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6814 Head
    • A61B 5/6815 Ear
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7246 Details of waveform analysis using correlation, e.g. template matching or determination of similarity
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7271 Specific aspects of physiological measurement analysis
    • A61B 5/7275 Determining trends in physiological measurement data; Predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 5/00 Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
    • A61M 5/14 Infusion devices, e.g. infusing by gravity; Blood infusion; Accessories therefor
    • A61M 5/168 Means for controlling media flow to the body or for metering media to the body, e.g. drip meters, counters; Monitoring media flow to the body
    • A61M 5/172 Means for controlling media flow to the body or for metering media to the body, electrical or electronic
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/23 Recognition of whole body movements, e.g. for sport training
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 2205/00 General characteristics of the apparatus
    • A61M 2205/18 General characteristics of the apparatus with alarm
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 2205/00 General characteristics of the apparatus
    • A61M 2205/35 Communication
    • A61M 2205/3546 Range
    • A61M 2205/3553 Range remote, e.g. between patient's home and doctor's office
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 2205/00 General characteristics of the apparatus
    • A61M 2205/50 General characteristics of the apparatus with microprocessors or computers
    • A61M 2205/502 User interfaces, e.g. screens or keyboards
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 2205/00 General characteristics of the apparatus
    • A61M 2205/50 General characteristics of the apparatus with microprocessors or computers
    • A61M 2205/52 General characteristics of the apparatus with microprocessors or computers with memories providing a history of measured variating parameters of apparatus or patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 2205/00 General characteristics of the apparatus
    • A61M 2205/58 Means for facilitating use, e.g. by people with impaired vision
    • A61M 2205/587 Lighting arrangements
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 2209/00 Ancillary equipment
    • A61M 2209/01 Remote controllers for specific apparatus
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 2230/00 Measuring parameters of the user
    • A61M 2230/20 Blood composition characteristics
    • A61M 2230/201 Glucose concentration
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 2230/00 Measuring parameters of the user
    • A61M 2230/63 Motion, e.g. physical activity
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 5/00 Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
    • A61M 5/14 Infusion devices, e.g. infusing by gravity; Blood infusion; Accessories therefor
    • A61M 5/142 Pressure infusion, e.g. using pumps
    • A61M 5/14244 Pressure infusion, e.g. using pumps adapted to be carried by the patient, e.g. portable on the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 5/00 Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
    • A61M 5/178 Syringes
    • A61M 5/24 Ampoule syringes, i.e. syringes with needle for use in combination with replaceable ampoules or carpules, e.g. automatic

Definitions

  • the present technology is generally related to the control, operation, and regulation of a medication delivery system in a way that leverages an automated stress detection system.
  • a typical insulin infusion device includes a fluid pump mechanism and an associated drive system that actuates a plunger or piston of a fluid reservoir to deliver fluid medication from the reservoir, via a fluid delivery conduit, to the body of a patient.
  • Use of infusion pump therapy has been increasing, especially for delivering insulin to diabetic patients.
  • Control schemes have been developed to allow insulin infusion devices to monitor and regulate a patient's blood glucose level in a substantially continuous and autonomous manner.
  • An insulin infusion device can be operated in an automatic mode wherein basal insulin is delivered at a rate that is automatically adjusted for the user.
  • an insulin infusion device can be operated to automatically calculate, recommend, and deliver insulin boluses as needed (e.g., to compensate for meals consumed by the user).
  • the amount of an insulin bolus should be accurately calculated and administered to maintain the user's blood glucose within the desired range.
  • an automatically generated and delivered insulin bolus should safely manage the user's blood glucose level and keep it above a defined threshold level.
  • an insulin infusion device operating in an automatic mode uses continuous glucose sensor data and control algorithms to regulate the user's blood glucose, based on a target glucose setpoint setting and user-initiated meal announcements that typically include estimations of the amount of carbohydrates to be consumed in an upcoming meal.
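  • As a concrete illustration of the bolus calculation described above, the following sketch combines a meal component (announced carbohydrates divided by a carbohydrate ratio) with a correction component (glucose above the target setpoint divided by an insulin sensitivity factor). The function name, parameter names, and default values are illustrative assumptions, not taken from this disclosure:

```python
# Hypothetical sketch of an automatic-mode bolus recommendation.
# carb_ratio  -- grams of carbohydrate covered by 1 U of insulin (assumed)
# sensitivity -- mg/dL glucose drop per 1 U of insulin (assumed ISF)
def recommend_bolus(carbs_g, glucose_mgdl, target_mgdl=120.0,
                    carb_ratio=10.0, sensitivity=50.0, active_insulin=0.0):
    """Return a recommended insulin bolus in units (U)."""
    meal_bolus = carbs_g / carb_ratio
    # Only correct downward-trending errors above target; never dose negative.
    correction = max(glucose_mgdl - target_mgdl, 0.0) / sensitivity
    # Subtract insulin-on-board so overlapping boluses do not stack.
    return max(meal_bolus + correction - active_insulin, 0.0)
```

For example, a 60 g meal announcement at a sensed glucose of 180 mg/dL yields 6.0 U for the meal plus 1.2 U of correction under these assumed parameters.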
  • the subject matter of this disclosure generally relates to a system that automatically detects when a user is experiencing stress and, in response to the detection, regulates, controls, or adjusts the operation of a medication delivery system in a stress-correlated manner.
  • the present disclosure provides a method of operating a medication delivery system having a fluid pump mechanism and at least one controller that regulates operation of the fluid pump mechanism to deliver medication from the medication delivery system.
  • the method involves: operating the medication delivery system in a first mode of operation to automatically deliver the medication to a user in accordance with a therapy control algorithm; receiving stress-identifying data that indicates a current stress status of the user, the stress-identifying data generated at least in part from gesture data for the user, the gesture data provided by a gesture-based physical behavior detection system; determining, from the stress-identifying data, that the user is under stress while the medication delivery system is operating in the first mode; and in response to the determining, operating the medication delivery system in a second mode of operation to automatically deliver the medication to the user in accordance with a stress-correlated therapy control algorithm.
  • the second mode of operation compensates for user stress as determined from the stress-identifying data.
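  • The two-mode scheme described above can be sketched as a small state machine: a stress score derived from gesture data selects between the normal therapy control algorithm and a stress-correlated one. The class names, the threshold, and the idea of scaling basal delivery under stress are assumptions for illustration; the disclosure does not fix a specific stress-correlated algorithm:

```python
# Hypothetical sketch of switching between a first (normal) and second
# (stress-correlated) mode of operation based on a stress estimate.
from dataclasses import dataclass

@dataclass
class StressSample:
    score: float  # normalized stress estimate derived from gesture data

class DeliveryController:
    STRESS_THRESHOLD = 0.7    # illustrative cut-off for "user is under stress"
    STRESS_BASAL_SCALE = 0.9  # illustrative stress-correlated adjustment

    def __init__(self, basal_rate_u_per_hr: float):
        self.basal_rate = basal_rate_u_per_hr
        self.mode = "normal"

    def update(self, sample: StressSample) -> None:
        """Select the therapy mode from the latest stress-identifying data."""
        self.mode = "stress" if sample.score >= self.STRESS_THRESHOLD else "normal"

    def commanded_rate(self) -> float:
        """Basal rate under the currently selected therapy control algorithm."""
        if self.mode == "stress":
            return self.basal_rate * self.STRESS_BASAL_SCALE
        return self.basal_rate
```

The controller reverts to the first mode automatically when subsequent stress-identifying data falls below the threshold.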
  • the disclosure provides a medication delivery system having: a fluid pump mechanism; at least one controller that regulates operation of the fluid pump mechanism to deliver insulin from the medication delivery system; and at least one memory element associated with the at least one controller, the at least one memory element storing processor-executable instructions configurable to be executed by the at least one controller to perform a method of controlling operation of the medication delivery system.
  • the method involves: operating the medication delivery system in a first mode of operation to automatically deliver the insulin to a user in accordance with a therapy control algorithm; receiving stress-identifying data that indicates a current stress status of the user, the stress-identifying data generated at least in part from gesture data for the user, the gesture data provided by a gesture-based physical behavior detection system; determining, from the stress-identifying data, that the user is under stress while the medication delivery system is operating in the first mode; and in response to the determining, operating the medication delivery system in a second mode of operation to automatically deliver the insulin to the user in accordance with a stress-correlated therapy control algorithm, wherein the second mode of operation compensates for user stress as determined from the stress-identifying data.
  • the disclosure provides a system having: an insulin infusion device that regulates delivery of insulin to a user; a gesture-based physical behavior detection system configured to generate gesture data for the user, and configured to communicate the gesture data; and at least one controller that controls operation of the insulin infusion device.
  • the at least one controller is configured to: operate the insulin infusion device in a first mode of operation to automatically deliver the insulin to the user in accordance with a therapy control algorithm; process stress-identifying data that indicates a current stress status of the user, the stress-identifying data generated at least in part from gesture data provided by the gesture-based physical behavior detection system; determine, from the stress-identifying data, that the user is under stress while the insulin infusion device is operating in the first mode; and in response to determining that the user is under stress, operate the insulin infusion device in a second mode of operation to automatically deliver the insulin to the user in accordance with a stress-correlated therapy control algorithm.
  • the second mode of operation compensates for user stress as determined from the stress-identifying data.
  • FIG. 1 is a simplified block diagram representation of an exemplary embodiment of a system that includes a medication delivery system that responds to patient stress levels as indicated by the output of a gesture-based physical behavior detection system;
  • FIG. 2 is a plan view of an exemplary embodiment of an insulin infusion device that is suitable for use as the medication delivery system shown in FIG. 1 ;
  • FIG. 3 is a top perspective view of an embodiment of an insulin infusion device implemented as a patch pump device that is suitable for use as the medication delivery system shown in FIG. 1 ;
  • FIG. 4 is a perspective view of an exemplary embodiment of a smart insulin pen that is suitable for use as the medication delivery system shown in FIG. 1 ;
  • FIG. 5 is a perspective view of an exemplary embodiment of a smart pen accessory that is suitable for use with the medication delivery system shown in FIG. 1 ;
  • FIG. 6 is a block diagram representation of an exemplary embodiment of a computer-based or processor-based device suitable for deployment in the system shown in FIG. 1 ;
  • FIG. 7 is a block diagram representation of a closed loop glucose control system arranged in accordance with certain embodiments.
  • FIG. 8 is a block diagram representation of a gesture-based physical behavior detection system arranged in accordance with certain embodiments.
  • FIG. 9 is a flow chart that illustrates an infusion device control process according to certain embodiments.
  • FIG. 10 is a flow chart that illustrates a training process according to certain embodiments.
  • the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit.
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
  • The instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term "processor" may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
  • FIG. 1 is a simplified block diagram representation of an exemplary embodiment of a system 100 that responds to patient stress levels by adjusting at least one setting, function, or therapy-related operation of a medication delivery system 102 .
  • the medication delivery system 102 responds to patient stress levels as indicated by the output of a gesture-based physical behavior detection system 104 and/or the output of at least one ancillary sensor, detector, or measurement system 106 (hereinafter referred to as ancillary system(s) 106 ).
  • Certain embodiments of the system 100 include, without limitation: the medication delivery system 102 (or device) that regulates delivery of medication to a user; at least one gesture-based physical behavior detection system 104 that monitors user behavior and/or status to obtain gesture data that indicates whether the user is under stress; at least one ancillary system 106 ; at least one user device 108 that includes or cooperates with a suitably written and configured patient care application 110 ; an analyte sensor 112 to measure a physiological characteristic of the user, such that sensor data obtained from the analyte sensor 112 can be used to control, regulate, or otherwise influence the operation of the medication delivery system 102 ; and at least one patient history and outcomes database 114 .
  • the system includes at least one data processing system 116 , which may be in communication with any or all of the other components of the system 100 .
  • Other configurations and topologies for the system 100 are also contemplated here, such as a system that includes additional intermediary, interface, or data repeating devices in the data path between a sending device and a receiving device.
  • At least some of the components of the system 100 are communicatively coupled with one another to support data communication, signaling, and/or transmission of control commands as needed, via at least one communications network 120 .
  • the at least one communications network 120 may support wireless data communication and/or data communication using tangible data communication links.
  • FIG. 1 depicts network communication links in a simplified manner.
  • the system 100 may cooperate with and leverage any number of wireless and any number of wired data communication networks maintained or operated by various entities and providers. Accordingly, communication between the various components of the system 100 may involve multiple network links and different data communication protocols.
  • the network can include or cooperate with any of the following, without limitation: a local area network; a wide area network; the Internet; a personal area network; a near-field data communication link; a cellular communication network; a satellite communication network; a video services or television broadcasting network; a network onboard a vehicle; or the like.
  • the components of the system 100 may be suitably configured to support a variety of wireless and wired data communication protocols, technologies, and techniques as needed for compatibility with the at least one communication network 120 .
  • the system 100 can support any type of medication delivery system 102 that is compatible with the features and functionality described here.
  • the medication delivery system 102 may be realized as a user-activated or user-actuated fluid delivery device, such as a manual syringe, an injection pen, or the like.
  • the medication delivery system 102 may be implemented as an electronic device that is operated to regulate the delivery of medication fluid to the user.
  • the medication delivery system 102 includes or is realized as an insulin infusion device, e.g., a portable patient-worn or patient-carried insulin pump.
  • the analyte sensor 112 includes or is realized as a glucose meter, a glucose sensor, or a continuous glucose monitor.
  • infusion pumps may be of the type described in, but not limited to, U.S. Pat. Nos. 4,562,751; 4,685,903; 5,080,653; 5,505,709; 5,097,122; 6,485,465; 6,554,798; 6,558,320; 6,558,351; 6,641,533; 6,659,980; 6,752,787; 6,817,990; 6,932,584; and 7,621,893; each of which is herein incorporated by reference.
  • FIG. 2 is a plan view of an exemplary embodiment of an insulin infusion device 130 suitable for use as the medication delivery system 102 shown in FIG. 1 .
  • the insulin infusion device 130 is a portable medical device designed to be carried or worn by the patient.
  • the illustrated embodiment of the insulin infusion device 130 includes a housing 132 adapted to receive an insulin-containing reservoir (hidden from view in FIG. 2 ).
  • An opening in the housing 132 accommodates a fitting 134 (or cap) for the reservoir, with the fitting 134 being configured to mate or otherwise interface with tubing 136 of an infusion set 138 that provides a fluid path to/from the body of the user. In this manner, fluid communication from the interior of the insulin reservoir to the user is established via the tubing 136 .
  • the illustrated version of the insulin infusion device 130 includes a human-machine interface (HMI) 140 (or user interface) that includes elements that can be manipulated by the user to administer a bolus of fluid (e.g., insulin), to change therapy settings, to change user preferences, to select display features, and the like.
  • the insulin infusion device 130 also includes a display 142 , such as a liquid crystal display (LCD) or another suitable display technology, that can be used to present various types of information or data to the user, such as, without limitation: the current glucose level of the patient; the time; a graph or chart of the patient's glucose level versus time; device status indicators; etc.
  • the insulin infusion device 130 may be configured and controlled to support other features and interactive functions described in more detail below.
  • FIG. 3 is a top perspective view of an embodiment of an insulin infusion device 146 implemented as a patch pump device that is suitable for use as the medication delivery system 102 shown in FIG. 1 .
  • the insulin infusion device 146 can be implemented as a combination device that includes an insertable insulin delivery cannula and an insertable glucose sensor (both of which are hidden from view in FIG. 3 ). In such an implementation, the glucose sensor may take the place of the separate analyte sensor 112 shown in FIG. 1 .
  • the insulin infusion device 146 includes a housing 148 that serves as a shell for a variety of internal components.
  • FIG. 3 shows the insulin infusion device 146 with a removable fluid cartridge module 150 installed and secured therein.
  • the housing 148 is suitably configured to receive, secure, and release the removable fluid cartridge module 150 .
  • the insulin infusion device 146 includes at least one user interface feature, which can be actuated by the patient as needed.
  • the illustrated embodiment of the insulin infusion device 146 includes a button 152 that is physically actuated.
  • the button 152 can serve as a multipurpose user interface, if so desired, to make it easier for the user to operate the insulin infusion device 146 .
  • the button 152 can be used in connection with one or more of the following functions, without limitation: waking up the processor and/or electronics of the insulin infusion device 146 ; triggering an insertion mechanism to insert a fluid delivery cannula and/or an analyte sensor into the subcutaneous space or similar region of the user; configuring one or more settings of the insulin infusion device 146 ; initiating delivery of medication fluid from the fluid cartridge module 150 ; initiating a fluid priming operation; disabling alerts or alarms generated by the insulin infusion device 146 ; and the like.
  • the insulin infusion device 146 can employ a slider mechanism, a pin, a lever, a switch, a touch-sensitive element, or the like.
  • the insulin infusion device 146 may be configured and controlled to support other features and interactive functions described in more detail below.
  • FIG. 4 is a perspective view of an exemplary embodiment of a smart insulin pen 160 suitable for use as the medication delivery system shown in FIG. 1 .
  • the pen 160 includes an injector body 162 and a cap 164 .
  • FIG. 4 shows the cap 164 removed from the injector body 162 , such that a delivery needle 166 is exposed.
  • the pen 160 includes suitably configured electronics and processing capability to communicate with an application running on a user device, such as a smartphone, to support various functions and features such as: tracking active insulin; calculating insulin dosages (boluses); tracking insulin dosages; monitoring insulin supply levels; patient reminders and notifications; and patient status reporting.
  • the smart insulin pen 160 can receive insulin dosage recommendations or instructions and/or recommended dosing times (or a recommended dosing schedule).
  • the smart insulin pen 160 may be configured and controlled to support other features and interactive functions described in more detail below.
  • FIG. 5 is a perspective view of an exemplary embodiment of a smart pen accessory 170 that is suitable for use with the medication delivery system 102 shown in FIG. 1 .
  • the smart pen accessory 170 cooperates with a “non-smart” insulin pen that lacks the intelligence and functionality of a smart insulin pen (as described above).
  • the smart pen accessory 170 can be realized as a pen cap, a clip-on apparatus, a sleeve, or the like.
  • the smart pen accessory 170 is attached to an insulin pen 172 such that the smart pen accessory 170 can measure the amount of insulin delivered by the insulin pen 172 .
  • the insulin dosage data is stored by the smart pen accessory 170 along with corresponding date/time stamp information.
  • the smart pen accessory 170 can receive, store, and process additional patient-related or therapy-related data, such as glucose data.

  • the smart pen accessory 170 may also support various features and functions described above in the context of the smart insulin pen 160 .
  • the smart pen accessory 170 may be configured to receive insulin dosage recommendations or instructions and/or recommended dosing times (or a recommended dosing schedule).
  • the smart pen accessory 170 may be configured and controlled to support other features and interactive functions described in more detail below.
  • a fluid infusion device (such as an insulin infusion device) includes a fluid pump mechanism having a motor or other actuation arrangement that is operable to linearly displace a plunger (or stopper) of a fluid reservoir provided within the fluid infusion device to deliver a dosage of fluid medication, such as insulin, to the body of a user.
  • Dosage commands that govern operation of the motor may be generated in an automated manner in accordance with the delivery control scheme associated with a particular operating mode, and the dosage commands may be generated in a manner that is influenced by a current (or most recent) measurement of a physiological condition in the body of the user.
  • a closed-loop or automatic operating mode can be used to generate insulin dosage commands based on a difference between a current (or most recent) measurement of the interstitial fluid glucose level in the body of the user and a target (or reference) glucose setpoint value.
  • the rate of infusion may vary as the difference between a current measurement value and the target measurement value fluctuates.
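The behavior described above, an infusion rate that varies with the difference between the current measurement and the target setpoint, can be sketched as a clamped proportional controller; the gain and rate bounds below are illustrative placeholders, not values from the disclosure.

```python
def basal_rate(current_glucose_mg_dl, target_glucose_mg_dl,
               base_rate_u_per_hr, gain_u_per_hr_per_mg_dl,
               max_rate_u_per_hr):
    """Simplified proportional control: the infusion rate rises as the
    measured glucose exceeds the target, clamped to safe bounds."""
    error = current_glucose_mg_dl - target_glucose_mg_dl
    rate = base_rate_u_per_hr + gain_u_per_hr_per_mg_dl * error
    return min(max(rate, 0.0), max_rate_u_per_hr)
```

Real closed-loop algorithms are considerably more sophisticated (model-predictive, with insulin-on-board compensation); this sketch only illustrates the measurement-versus-setpoint relationship stated above.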
  • the subject matter is described herein in the context of the infused fluid being insulin for regulating a glucose level of a user (or patient); however, it should be appreciated that many other fluids may be administered through infusion, and the subject matter described herein is not necessarily limited to use with insulin.
  • the analyte sensor 112 may communicate sensor data to the medication delivery system 102 for use in regulating or controlling operation of the medication delivery system 102 .
  • the analyte sensor 112 may communicate sensor data to one or more other components in the system 100 , such as, without limitation: a user device 108 (for use with the patient care application 110 ); a data processing system 116 ; and/or a patient history and outcomes database 114 .
  • the system 100 can support any number of user devices 108 linked to the particular user or patient.
  • a user device 108 may be, without limitation: a smartphone device; a laptop, desktop, or tablet computer device; a medical device; a wearable device; a global positioning system (GPS) receiver device; a system, component, or feature onboard a vehicle; a smartwatch device; a television system; a household appliance; a video game device; a media player device; or the like.
  • the medication delivery system 102 and the at least one user device 108 are owned by, operated by, or otherwise linked to a user/patient. Any given user device 108 can host, run, or otherwise execute the patient care application 110 .
  • the user device 108 is implemented as a smartphone with the patient care application 110 installed thereon.
  • the patient care application 110 is implemented in the form of a website or webpage, e.g., a website of a healthcare provider, a website of the manufacturer, supplier, or retailer of the medication delivery system 102 , or a website of the manufacturer, supplier, or retailer of the analyte sensor 112 .
  • the medication delivery system 102 executes the patient care application 110 as a native function.
  • the features or output of the gesture-based physical behavior detection system 104 and/or the ancillary system(s) 106 can be used to influence features, functions, and/or therapy-related operations of the medication delivery system 102 .
  • the systems 104 , 106 may be suitably configured and operated to generate and provide output (e.g., data, control signals, markers, or flags) that indicates user stress, such that the medication delivery system 102 can dynamically respond in a stress-correlated manner to compensate for detected user stress.
  • the gesture-based physical behavior detection system 104 includes one or more sensors, detectors, measurement devices, and/or readers to automatically detect certain user gestures that correlate to user stress (e.g., work-related physical activity, commuting, arguing, fighting, stress or nervous eating, stress or nervous drinking) or lack thereof (e.g., napping, normal eating or drinking, painting, jogging, or dancing).
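A minimal way to turn detected gesture labels into a stress indicator is a set-membership rule over the example activities listed above; the label vocabulary below is hypothetical, since the system's actual gesture categories are implementation-specific.

```python
# Hypothetical gesture labels mirroring the examples in the text above.
STRESS_GESTURES = {"arguing", "fighting", "nervous_eating",
                   "nervous_drinking", "commuting", "work_activity"}
CALM_GESTURES = {"napping", "normal_eating", "normal_drinking",
                 "painting", "jogging", "dancing"}

def stress_flag(detected_gestures):
    """Return True when a stress-correlated gesture was detected and no
    calm-correlated activity was also present; a deliberately crude rule."""
    hits = set(detected_gestures)
    return bool(hits & STRESS_GESTURES) and not (hits & CALM_GESTURES)
```

A production detector would weight gestures by duration and confidence rather than using a binary set test.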
  • the gesture-based physical behavior detection system 104 may communicate gesture data to the medication delivery system 102 , the user device 108 , and/or the data processing system 116 for processing in an appropriate manner for use in regulating or controlling certain functions of the medication delivery system 102 .
  • the gesture data may be communicated to a user device 108 , such that the user device 108 can process the gesture data and inform the user or the medication delivery system 102 as needed (e.g., remotely regulate or control certain functions of the medication delivery system 102 ).
  • the gesture-based physical behavior detection system 104 may communicate the gesture data to one or more cloud computing systems or servers (such as a remote data processing system 116 ) for appropriate processing and handling in the manner described herein.
  • an ancillary system 106 may include one or more sensors, detectors, measurement devices, and/or readers that obtain ancillary user status data that correlates to user stress or lack thereof.
  • an ancillary system 106 may include, cooperate with, or be realized as any of the following, without limitation: a heartrate monitor linked to the user; a blood pressure monitor linked to the user; a respiratory rate monitor linked to the user; a vital signs monitor linked to the user; a thermometer (for the user's body temperature and/or the environmental temperature); a sweat detector linked to the user; an activity tracker linked to the user; a global positioning system (GPS); a clock, calendar, or appointment application linked to the user; a pedometer linked to the user; or the like.
  • An ancillary system 106 may be configured and operated to communicate its output (user status data) to one or more components of the system 100 for analysis, processing, and handling in the manner described herein.
  • user status data obtained from one or more ancillary systems 106 supplements the gesture data obtained from the gesture-based physical behavior detection system 104 , such that periods of user stress and corresponding stress levels are accurately and reliably detected.
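A rough sketch of how gesture data and ancillary vitals might be fused into a coarse stress level follows; the weights, thresholds, and heart-rate normalization are assumptions for illustration only, not the system's actual fusion logic.

```python
def stress_level(gesture_score, heart_rate_bpm, resting_hr_bpm, sweat_index):
    """Blend a gesture-derived score (0..1) with ancillary vitals into a
    coarse stress level; all weights here are illustrative placeholders."""
    # Normalize heart-rate elevation: ~1.0 at 60 bpm above resting.
    hr_component = min(max(heart_rate_bpm - resting_hr_bpm, 0) / 60.0, 1.0)
    score = 0.5 * gesture_score + 0.3 * hr_component + 0.2 * sweat_index
    if score >= 0.66:
        return "high"
    if score >= 0.33:
        return "moderate"
    return "low"
```

Weighting the gesture channel most heavily reflects the text's framing of ancillary data as a supplement to the gesture data, but the actual balance would be tuned per user from the training data described below.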
  • the gesture-based physical behavior detection system 104 and the medication delivery system 102 are implemented as physically distinct and separate components, as depicted in FIG. 1 .
  • the gesture-based physical behavior detection system 104 is external to the medication delivery system 102 and is realized as an ancillary component, relative to the medication delivery system 102 .
  • the medication delivery system 102 and the gesture-based physical behavior detection system 104 can be combined into a single hardware component or provided as a set of attached hardware devices.
  • the medication delivery system 102 may include the gesture-based physical behavior detection system 104 or integrate the functionality of the system 104 .
  • the analyte sensor 112 can be incorporated with the medication delivery system 102 or the gesture-based physical behavior detection system 104 .
  • the at least one patient history and outcomes database 114 includes historical data related to the user's physical condition, physiological response to the medication regulated by the medication delivery system 102 , stress-related or stress-correlated factors, and the like.
  • the database 114 can maintain any of the following, without limitation: historical glucose data and corresponding date/time stamp information; insulin delivery and dosage information; user-entered stress markers or indicators; gesture data (provided by the gesture-based physical behavior detection system 104 ) and corresponding date/time stamp information; ancillary user status data (provided by one or more ancillary systems 106 ) and corresponding date/time stamp data; diet or food intake history for the user; physical activity data, such as an exercise log; and any other information that may be generated by or used by the system 100 for purposes of controlling the operation of the medication delivery system 102 .
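The timestamped records enumerated above can be pictured as entries of a simple tagged schema; the field names below are illustrative, not the database's actual layout.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class PatientHistoryEntry:
    """Illustrative record for a patient history and outcomes store."""
    timestamp: datetime
    kind: str                    # e.g. "glucose", "insulin_dose", "gesture", "stress_marker"
    value: Optional[float] = None   # numeric payload, if any
    label: Optional[str] = None     # e.g. a gesture name or user-entered stress note
    source: str = "unknown"         # which system produced the record

def filter_by_kind(entries: List[PatientHistoryEntry], kind: str):
    """Pull out one stream (e.g. all glucose readings) for analysis."""
    return [e for e in entries if e.kind == kind]
```

Keeping every record type in one timestamped stream makes it straightforward to align glucose, dosing, and gesture history when training the stress-detection models described below.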
  • the at least one patient history and outcomes database 114 can receive and maintain training data that is utilized to train, configure, and initialize the system 100 based on historical user behavior, physiological state, operation of the medication delivery system 102 , and user-identified periods of stress.
  • a patient history and outcomes database 114 may reside at a user device 108 , at the medication delivery system 102 , at a data processing system 116 , or at any network-accessible location (e.g., a cloud-based database or server system). In certain embodiments, a patient history and outcomes database 114 may be included with the patient care application 110 . The patient history and outcomes database 114 enables the system 100 to generate recommendations, warnings, predictions, and guidance for the user and/or to regulate the manner in which the medication delivery system 102 administers therapy to the user, based on detected stress levels and periods of stress.
  • FIG. 6 is a simplified block diagram representation of an exemplary embodiment of a computer-based or processor-based device 200 that is suitable for deployment in the system 100 shown in FIG. 1 .
  • the illustrated embodiment of the device 200 is intended to be a high-level and generic representation of one suitable platform.
  • any computer-based or processor-based component of the system 100 can utilize the architecture of the device 200 .
  • the illustrated embodiment of the device 200 generally includes, without limitation: at least one controller (or processor) 202 ; a suitable amount of memory 204 that is associated with the at least one controller 202 ; device-specific items 206 (including, without limitation: hardware, software, firmware, user interface (UI), alerting, and notification features); a power supply 208 such as a disposable or rechargeable battery; a communication interface 210 ; at least one application programming interface (API) 212 ; and a display element 214 .
  • an implementation of the device 200 may include additional elements, components, modules, and functionality configured to support various features that are unrelated to the primary subject matter described here.
  • the device 200 may include certain features and elements to support conventional functions that might be related to the particular implementation and deployment of the device 200 .
  • the elements of the device 200 may be coupled together via at least one bus or any suitable interconnection architecture 216 .
  • the at least one controller 202 may be implemented or performed with a general purpose processor, a content addressable memory, a microcontroller unit, a digital signal processor, an application specific integrated circuit, a field programmable gate array, any suitable programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination designed to perform the functions described here. Moreover, the at least one controller 202 may be implemented as a combination of computing devices, e.g., a combination of a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other such configuration.
  • the memory 204 may be realized as at least one memory element, device, module, or unit, such as: RAM memory, flash memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • the memory 204 can be coupled to the at least one controller 202 such that the at least one controller 202 can read information from, and write information to, the memory 204 .
  • the memory 204 may be integral to the at least one controller 202 .
  • the at least one controller 202 and the memory 204 may reside in an ASIC.
  • At least a portion of the memory 204 can be realized as a computer storage medium that is operatively associated with the at least one controller 202 , e.g., a tangible, non-transitory computer-readable medium having computer-executable instructions stored thereon.
  • the computer-executable instructions are configurable to be executed by the at least one controller 202 to cause the at least one controller 202 to perform certain tasks, operations, functions, and processes that are specific to the particular embodiment.
  • the memory 204 may represent one suitable implementation of such computer-readable media.
  • the device 200 could receive and cooperate with computer-readable media (not separately shown) that is realized as a portable or mobile component or platform, e.g., a portable hard drive, a USB flash drive, an optical disc, or the like.
  • the device-specific items 206 may vary from one embodiment of the device 200 to another.
  • the device-specific items 206 will support: sensor device operations when the device 200 is realized as a sensor device; smartphone features and functionality when the device 200 is realized as a smartphone; activity tracker features and functionality when the device 200 is realized as an activity tracker; smart watch features and functionality when the device 200 is realized as a smart watch; medical device features and functionality when the device is realized as a medical device; etc.
  • certain portions or aspects of the device-specific items 206 may be implemented in one or more of the other blocks depicted in FIG. 6 .
  • the UI of the device 200 may include or cooperate with various features to allow a user to interact with the device 200 .
  • the UI may include various human-to-machine interfaces, e.g., a keypad, keys, a keyboard, buttons, switches, knobs, a touchpad, a joystick, a pointing device, a virtual writing tablet, a touch screen, a microphone, or any device, component, or function that enables the user to select options, input information, or otherwise control the operation of the device 200 .
  • the UI may include one or more graphical user interface (GUI) control elements that enable a user to manipulate or otherwise interact with an application via the display element 214 .
  • the display element 214 and/or the device-specific items 206 may be utilized to generate, present, render, output, and/or annunciate alerts, alarms, messages, or notifications that are associated with operation of the medication delivery system 102 , associated with a status or condition of the user, associated with operation, status, or condition of the system 100 , etc.
  • the communication interface 210 facilitates data communication between the device 200 and other components as needed during the operation of the device 200 .
  • the communication interface 210 can be employed to transmit or stream device-related control data, patient-related user status (e.g., gesture data or status data), device-related status or operational data, sensor data, calibration data, and the like.
  • the particular configuration and functionality of the communication interface 210 can vary depending on the hardware platform and specific implementation of the device 200 .
  • an embodiment of the device 200 may support wireless data communication and/or wired data communication, using various data communication protocols.
  • the communication interface 210 could support one or more wireless data communication protocols, techniques, or methodologies, including, without limitation: RF; IrDA (infrared); Bluetooth; BLE; ZigBee (and other variants of the IEEE 802.15 protocol); IEEE 802.11 (any variation); IEEE 802.16 (WiMAX or any other variation); Direct Sequence Spread Spectrum; Frequency Hopping Spread Spectrum; cellular/wireless/cordless telecommunication protocols; wireless home network communication protocols; paging network protocols; magnetic induction; satellite data communication protocols; wireless hospital or health care facility network protocols such as those operating in the WMTS bands; GPRS; and proprietary wireless data communication protocols such as variants of Wireless USB.
  • the communication interface 210 could support one or more wired/cabled data communication protocols, including, without limitation: Ethernet; powerline; home network communication protocols; USB; IEEE 1394 (Firewire); hospital network communication protocols; and proprietary data communication protocols.
  • the at least one API 212 supports communication and interactions between software applications and logical components that are associated with operation of the device 200 .
  • one or more APIs 212 may be configured to facilitate compatible communication and cooperation with the patient care application 110 , and to facilitate receipt and processing of data from sources external to the device 200 (e.g., databases or remote devices and systems).
  • the display element 214 is suitably configured to enable the device 200 to render and display various screens, recommendation messages, alerts, alarms, notifications, GUIs, GUI control elements, drop down menus, auto-fill fields, text entry fields, message fields, or the like.
  • the display element 214 may also be utilized for the display of other information during the operation of the device 200 , as is well understood.
  • the specific configuration, operating characteristics, size, resolution, and functionality of the display element 214 can vary depending upon the implementation of the device 200 .
  • FIG. 7 is a simplified block diagram representation of a closed loop glucose control system 300 arranged in accordance with certain embodiments.
  • the system 300 depicted in FIG. 7 functions to regulate the rate of fluid infusion into a body of a user based on feedback from an analyte concentration measurement taken from the body, along with other information (e.g., history of insulin delivered, exercise or activity indications).
  • the system 300 is implemented as an automated control system for regulating the rate of insulin infusion into the body of a user based on a glucose concentration measurement taken from the body.
  • the system 300 is designed to model the physiological response of the user to control an insulin infusion device 302 in an appropriate manner to release insulin 304 into the body 306 of the user in a similar concentration profile as would be created by fully functioning human β-cells when responding to changes in blood glucose concentrations in the body.
  • the system 300 simulates the body's natural insulin response to blood glucose levels and not only makes efficient use of insulin but also accounts for other bodily functions, since insulin has both metabolic and mitogenic effects.
  • Certain embodiments of the system 300 include, without limitation: the insulin infusion device 302 ; a glucose sensor system 308 (e.g., the analyte sensor 112 shown in FIG. 1 ); and at least one controller 310 , which may be incorporated in the insulin infusion device 302 as shown in FIG. 7 .
  • the glucose sensor system 308 generates a sensor signal 314 representative of blood glucose levels 316 in the body 306 , and provides the sensor signal 314 to the at least one controller 310 .
  • the at least one controller 310 receives the sensor signal 314 and generates commands 320 that regulate the timing and dosage of insulin 304 delivered by the insulin infusion device 302 .
  • the commands 320 are generated in response to various factors, variables, settings, and control algorithms utilized by the insulin infusion device 302 .
  • the commands 320 (and, therefore, the delivery of insulin 304 ) can be influenced by a target glucose setpoint value 322 that is maintained and regulated by the insulin infusion device 302 .
  • the commands 320 (and, therefore, the delivery of insulin 304 ) can be influenced by stress-related information 324 obtained from one or more sources, e.g., the gesture-based physical behavior detection system 104 and/or one or more ancillary systems 106 (see FIG. 1 ).
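One hedged sketch of how the stress-related information 324 could influence the commands 320 is a stress-dependent adjustment of the target glucose setpoint 322; whether and by how much a real system should shift the target is a clinical design decision, and the offsets below are placeholders rather than values from the disclosure.

```python
def adjusted_setpoint(base_setpoint_mg_dl, stress_level):
    """Illustrative stress compensation: nudge the target setpoint when
    the detected stress level is elevated, so the controller tolerates
    stress-induced glucose excursions. Offsets are assumed values."""
    offsets_mg_dl = {"low": 0, "moderate": 10, "high": 20}
    return base_setpoint_mg_dl + offsets_mg_dl.get(stress_level, 0)
```

The adjusted setpoint would then feed the ordinary closed-loop command generation, leaving the control algorithm itself unchanged.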
  • the glucose sensor system 308 includes a continuous glucose sensor, sensor electrical components to provide power to the sensor and generate the sensor signal 314 , a sensor communication system to carry the sensor signal 314 to the at least one controller 310 , and a sensor system housing for the electrical components and the sensor communication system.
  • the glucose sensor system 308 may be implemented as a computer-based or processor-based component having the described configuration and features.
  • the at least one controller 310 includes controller electrical components and software to generate commands for the insulin infusion device 302 based on the sensor signal 314 , the target glucose setpoint value 322 , the stress-related information 324 , and other user-specific parameters, settings, and factors.
  • the at least one controller 310 may include a controller communication system to receive the sensor signal 314 and issue the commands 320 .
  • the insulin infusion device 302 includes a fluid pump mechanism 328 , a fluid reservoir 330 for the medication (e.g., insulin), and an infusion tube to infuse the insulin 304 into the body 306 .
  • the insulin infusion device 302 includes an infusion communication system to handle the commands 320 from the at least one controller 310 , electrical components and programmed logic to activate the fluid pump mechanism 328 motor according to the commands 320 , and a housing to hold the components of the insulin infusion device 302 .
  • the fluid pump mechanism 328 receives the commands 320 and delivers the insulin 304 from the fluid reservoir 330 to the body 306 in accordance with the commands 320 .
  • an embodiment of the insulin infusion device 302 can include additional elements, components, and features that may provide conventional functionality that need not be described herein. Moreover, an embodiment of the insulin infusion device 302 can include alternative elements, components, and features if so desired, as long as the intended and described functionality remains in place. In this regard, as mentioned above with reference to FIG. 6 , the insulin infusion device 302 may be implemented as a computer-based or processor-based component having the described configuration and features, including the display element 214 or other device-specific items 206 as described above.
  • the at least one controller 310 is configured and programmed to regulate the operation of the fluid pump mechanism 328 and other functions of the insulin infusion device 302 .
  • the at least one controller 310 controls the fluid pump mechanism 328 to deliver the fluid medication (e.g., insulin) from the fluid reservoir 330 to the body 306 .
  • the at least one controller 310 can be housed in the infusion device housing, wherein the infusion communication system is an electrical trace or a wire that carries the commands 320 from the at least one controller 310 to the fluid pump mechanism 328 .
  • the at least one controller 310 can be housed in the sensor system housing, wherein the sensor communication system is an electrical trace or a wire that carries the sensor signal 314 from the sensor electrical components to the at least one controller 310 .
  • the at least one controller 310 has its own housing or is included in a supplemental or ancillary device.
  • the at least one controller 310 , the insulin infusion device 302 , and the glucose sensor system 308 are all located within one common housing.
  • the gesture-based physical behavior detection system 104 employs at least one sensor to obtain corresponding user-specific sensor data.
  • the obtained user-specific sensor data is processed or analyzed by the gesture-based physical behavior detection system 104 and/or by another suitably configured device or component of the system 100 to determine whether the user is under stress (and, if so, how stressed).
  • the obtained user-specific sensor data may also be processed or analyzed to obtain certain stress-related parameters, characteristics, and/or metadata for the user.
  • the obtained user-specific sensor data may identify, include, or indicate any or all of the following, without limitation: timestamp data corresponding to periods of stress; a type, category, or classification of the physical behavior or activity that is contemporaneous with periods of stress; location data; user posture or position information; etc.
  • the gesture-based physical behavior detection system 104 may include, cooperate with, or be realized as a motion-based physical behavior detection system, an activity-based physical behavior detection system, an image or video based activity detection system, or the like.
  • the system 104 may be realized as a unitary “self-contained” wearable system that communicates with one or more other components of the system 100 .
  • the system 104 can be implemented with at least one wearable device such as an activity monitor device, a smart watch device, a smart bracelet device, or the like.
  • the system 104 may be realized as at least one portable or wearable device that includes or communicates with one or more external or ancillary sensor devices, units, or components.
  • the system 104 can be implemented with a wearable or portable smart device that is linked with one or more external sensors worn or carried by the user.
  • United States patent publication numbers US 2020/0135320 and US 2020/0289373 disclose gesture-based physical behavior detection systems that are suitable for use as the system 104 ; the entire content of these United States patent documents is incorporated by reference herein.
  • FIG. 8 is a block diagram representation of a gesture-based physical behavior detection system 400 arranged in accordance with certain embodiments.
  • the system 400 is suitable for use with the system 100 shown in FIG. 1 .
  • the system 400 is deployed as a wearable electronic device in the form factor of a bracelet or wristband that is worn around the wrist or arm of a user's dominant hand.
  • the system 400 may optionally be implemented using a modular design, wherein individual modules include one or more subsets of the disclosed components and overall functionality. The user may choose to add specific modules based on personal preferences and requirements.
  • the system 400 includes a battery 402 and a power management unit (PMU) 404 to deliver power at the proper supply voltage levels to all electronic circuits and components.
  • the PMU 404 may also include battery-recharging circuitry.
  • the PMU 404 may also include hardware, such as switches, that allows power to specific electronics circuits and components to be cut off when not in use.
  • circuitry and components in the system 400 are switched off to conserve power. Only circuitry and components that are required to detect or help predict the start of a behavior event of interest may remain enabled. For example, if no motion is being detected, all sensor circuits but an accelerometer 406 may be switched off and the accelerometer 406 may be put in a low-power wake-on-motion mode or in another lower power mode that consumes less power and uses less processing resources than its high performance active mode. A controller 408 of the system 400 may also be placed into a low-power mode to conserve power.
  • the accelerometer 406 and/or the controller 408 may switch into a higher power mode and additional sensors such as, for example, a gyroscope 410 and/or a proximity sensor 412 may also be enabled.
  • memory variables for storing event-specific parameters such as gesture types, gesture duration, etc. can be initialized.
  • the accelerometer 406 , upon detection of user motion, switches into a higher power mode, but other sensors remain switched off until the data from the accelerometer 406 indicates that the start of a behavior event has likely occurred. At that point in time, additional sensors such as the gyroscope 410 and the proximity sensor 412 may be enabled.
  • both the accelerometer 406 and gyroscope 410 are enabled but at least one of either the accelerometer 406 or the gyroscope 410 is placed in a lower power mode compared to their regular power mode. For example, the sampling rate may be reduced to conserve power.
  • the circuitry required to transfer data from the system 400 to a destination device may be placed in a lower power mode.
  • radio circuitry 414 could be disabled.
  • the circuitry required to transfer data from the system 400 may be placed in a lower power mode.
  • the radio circuitry 414 could be disabled until a possible or likely start of a behavior event has been determined. Alternatively, it may remain enabled but in a low power state to maintain the connection between the system 400 and one or more other components of the system 100 , but without transferring user status data, sensor data, or the like.
  • all motion-detection related circuitry may be switched off if, based on certain metadata, it is determined that the occurrence of a particular behavior event, such as a food intake event, is unlikely. This may be desirable to further conserve power.
  • Metadata used to make this determination may, among other things, include one or more of the following: time of the day, location, ambient light levels, proximity sensing, and detection that the system 400 has been removed from the wrist or hand, detection that the system 400 is being charged, or the like. Metadata may be generated and collected by the system 400 . Alternatively, metadata may be collected by another device that is external to the system 400 and is configured to directly or indirectly exchange information with the system 400 .
  • the system 400 may periodically or from time to time power up its radio circuitry 414 to retrieve metadata related information from another device.
  • some or all of the sensors may be turned on or placed in a higher power mode if certain metadata indicates that the occurrence of a particular behavior event, such as the user beginning to work, jog, or eat, is likely. Metadata used to make this determination may, among other things, include one or more of the following: time of the day; location; ambient light levels; proximity sensing; historical user behavior patterns. Some or all of the metadata may be collected by the system 400 or by an ancillary device that cooperates or communicates with the system 400 , as mentioned above.
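The power-gating strategy described in the bullets above can be sketched as a simple state machine. The class, method names, and metadata keys below are illustrative assumptions for this sketch, not part of the disclosed system:

```python
from enum import Enum

class PowerMode(Enum):
    OFF = 0
    LOW = 1
    FULL = 2

class SensorPowerManager:
    """Illustrative sketch: gate sensor power on motion and metadata."""

    def __init__(self):
        # Only the accelerometer runs by default, in a low-power mode.
        self.modes = {"accelerometer": PowerMode.LOW,
                      "gyroscope": PowerMode.OFF,
                      "proximity": PowerMode.OFF,
                      "radio": PowerMode.OFF}

    def on_motion_detected(self):
        # Detected motion wakes the accelerometer into its full-power mode.
        self.modes["accelerometer"] = PowerMode.FULL

    def on_event_start_likely(self):
        # A likely behavior-event start enables additional sensors.
        self.modes["gyroscope"] = PowerMode.FULL
        self.modes["proximity"] = PowerMode.FULL
        self.modes["radio"] = PowerMode.LOW  # keep the link alive, defer transfers

    def on_event_unlikely(self, metadata):
        # Metadata (e.g., charging, off-wrist) can justify switching all
        # motion-detection circuitry off to further conserve power.
        if metadata.get("charging") or metadata.get("off_wrist"):
            for name in self.modes:
                self.modes[name] = PowerMode.OFF
```

A symmetric rule could re-enable sensors when metadata (time of day, location, historical behavior patterns) makes an event likely.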
  • User status data used to track certain aspects of a user's behavior may be stored locally inside memory 416 of the system 400 and processed locally using the controller 408 of the system 400 .
  • User status data may also be transferred to the medication delivery system 102, the patient care application 110, and/or one or more databases 114 mentioned above with reference to FIG. 1 (such that the user status data can be processed, analyzed, or otherwise utilized by the applications or components that receive the user status data). It is also possible that some of the processing and analysis are performed locally by the system 400, while further processing and analysis are performed by one or more other components of the system 100.
  • the detection of the start of a behavior event may trigger the power up and/or activation of additional sensors and circuitry, such as a camera 418 .
  • Power up and/or activation of additional sensors and circuitry may occur at the same time as the detection of the behavior event of interest or some time thereafter.
  • Specific sensors and circuitry may be turned on only at specific times during a detected event, and may be switched off otherwise to conserve power. It is also possible that the camera 418 only gets powered up or activated upon explicit user intervention such as, for example, pushing and holding a button 420 . Releasing the button 420 may turn off the camera 418 to conserve power.
  • a projecting light source 422 may also be enabled to provide visual feedback to the user about the area that is within view of the camera or to otherwise illuminate the field of view.
  • the projecting light source 422 may only be activated sometime after the camera 418 has been activated.
  • additional conditions may need to be met before the projecting light source 422 is activated. Such conditions may include: the determination that the projecting light source 422 is likely aiming in the direction of the object of interest; the determination that the system 400 is not moving excessively; or the like.
  • one or more light emitting diodes (LEDs) 426 may be used as the projecting light source 422 .
  • Images may be tagged with additional information or metadata such as: camera focal information; proximity information from the proximity sensor 412 ; ambient light levels information from an ambient light sensor 424 ; timestamp information; etc.
  • additional information or metadata may be used during the processing and analysis of the user status data.
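Tagging a captured image with the metadata listed above might look like the following sketch; the field names are assumptions chosen for illustration:

```python
import time
from dataclasses import dataclass, field

@dataclass
class TaggedImage:
    """Illustrative container pairing a captured image with its metadata."""
    pixels: bytes
    focal_mm: float       # camera focal information
    proximity_cm: float   # reading from the proximity sensor 412
    ambient_lux: float    # reading from the ambient light sensor 424
    timestamp: float = field(default_factory=time.time)

# Example: tag a (dummy) capture so downstream processing and analysis of
# the user status data can weigh image quality and capture context.
img = TaggedImage(pixels=b"...", focal_mm=4.2, proximity_cm=25.0,
                  ambient_lux=180.0)
```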
  • the projecting light source 422 may also be used to communicate other information.
  • an ancillary device may use inputs from one or more proximity sensors 412 , process those inputs to determine if the camera 418 is within the proper distance range from the object of interest, and use one or more light sources to communicate that the camera is within the proper distance range, that the user needs to increase the distance between the camera and the object of interest, or that the user needs to reduce the distance between the camera and the object of interest.
  • the projecting light source 422 may also be used in combination with the ambient light sensor 424 to communicate to the user if the ambient light is insufficient or too strong for an adequate quality image capture.
  • the projecting light source 422 may also be used to communicate information including, but not limited to, a low battery situation or a functional defect.
  • the projecting light source 422 may also be used to communicate dietary coaching information.
  • the projecting light source 422 might, among other things, indicate if not enough or too much time has expired since a previous food intake event, or may communicate to the user how he/she is doing against specific dietary goals.
  • Signaling mechanisms to convey specific messages using one or more projecting light sources 422 may include, but are not limited to, one or more of the following: specific light intensities or light intensity patterns; specific light colors or light color patterns; specific spatial or temporal light patterns. Multiple mechanisms may also be combined to signal one specific message.
  • a microphone 428 may be used by the user to add specific or custom labels or messages to a detected event and/or image.
  • audio captured by the microphone 428 can be processed to assist in the determination of whether the user is eating or drinking.
  • Audio snippets may be processed by a voice recognition engine.
  • the accelerometer 406 may, in addition to tracking at least one parameter that is directly related to a gesture-based behavior event, also be used to track one or more parameters that are not directly related to that particular event.
  • Such parameters may, among other things, include physical activity, sleep, stress, or illness.
  • the system 400 may include or cooperate with any number of other sensors 430 as appropriate for the particular embodiment.
  • the system 400 may include or cooperate with any or all of the following: a heartrate monitor; a physiological characteristic or analyte sensor; a continuous glucose monitor; a GPS receiver; and any other sensor, monitor, or detector mentioned elsewhere herein.
  • the system 400 obtains user status data from one or more of its sensors, detectors, and sources, wherein the user status data indicates a stressful activity of the user.
  • the user status data can be analyzed and processed by the system 400 (and/or by one or more other components of the system 100) to determine whether the user is under stress and, in certain embodiments, to determine additional information, characteristics, or metrics related to the user's stress level, stressful conditions, etc.
  • the system 400 and/or an ancillary system 106 or device determines the user's stress activity status primarily based on the output of user-worn motion sensors, movement sensors, one or more inertial sensors (e.g., one or more accelerometers and/or one or more gyroscopes), one or more GPS sensors, one or more magnetometers, one or more force or physical pressure sensors, or the like, which are suitably configured, positioned, and arranged to measure physical movement or motion of the user's limbs, digits, joints, facial features, head, and/or other body parts.
  • the system 400 includes at least one haptic interface 440 that is suitably configured and operated to provide haptic feedback as an output.
  • the at least one haptic interface 440 generates output(s) that can be experienced by the sense of touch by the user, e.g., mechanical force, vibration, movement, temperature changes, or the like.
  • Haptic feedback generated by the at least one haptic interface 440 may represent or be associated with one or more of the following, without limitation: reminders; alerts; confirmations; notifications; messages; numerical values (such as measurements); status indicators; or any other type of output provided by the system 400 .
  • the user status data (e.g., sensor data) is provided to a gesture recognizer unit or processor.
  • sensor data may be sent in raw format.
  • a source of sensor data may perform some processing (e.g., filtering, compression, or formatting) on raw sensor data before sending the processed sensor data to the gesture recognizer unit.
  • the gesture recognizer unit analyzes the incoming sensor data and converts the incoming sensor data into a stream of corresponding gestures, which may be predetermined or otherwise classified or categorized.
  • the gesture recognizer unit may use one or more ancillary inputs (such as the output from one or more ancillary systems 106 ) to aid in the gesture determination process.
  • examples of ancillary input include: time of day; the probability of a specific gesture occurring based on statistical analysis of historical gesture data for that user; geographical location; heart rate; and/or other physiological sensor inputs. Other ancillary inputs are also possible.
  • the event detector analyzes the incoming stream of gestures to determine if the start of an event of interest (e.g., a period of stress or stressful activity) has occurred, whether an event is ongoing, whether an event has ended, or the like.
  • the event detector is implemented as a stress detector intended to capture stressful events, it will be suitably configured to determine if the start of a stressful period has occurred, if the user is still under stress, or if the user has stopped the stressful activity.
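The event detector stage described above can be sketched as a consumer of the gesture stream produced by the gesture recognizer unit. The gesture labels, windowing logic, and thresholds here are illustrative assumptions, not values from the disclosure:

```python
from collections import deque

class StressEventDetector:
    """Sketch: consume a stream of classified gestures and decide whether
    a stressful period has started, is ongoing, or has ended."""

    STRESS_GESTURES = {"fidgeting", "jaw_clench", "rapid_typing"}  # assumed labels

    def __init__(self, window=20, start_threshold=0.6, end_threshold=0.2):
        self.recent = deque(maxlen=window)   # sliding window of recent gestures
        self.start_threshold = start_threshold
        self.end_threshold = end_threshold
        self.in_stress_event = False

    def on_gesture(self, gesture: str) -> str:
        # Track the fraction of recent gestures that are stress-indicating.
        self.recent.append(gesture in self.STRESS_GESTURES)
        ratio = sum(self.recent) / len(self.recent)
        if not self.in_stress_event and ratio >= self.start_threshold:
            self.in_stress_event = True
            return "stress_start"
        if self.in_stress_event and ratio <= self.end_threshold:
            self.in_stress_event = False
            return "stress_end"
        return "stress_ongoing" if self.in_stress_event else "no_stress"
```

In a real implementation the decision would also weigh ancillary inputs (time of day, location, heart rate) rather than gesture frequency alone.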
  • the gesture-based physical behavior detection system 400 may be suitably configured to monitor other types of physical behavior or activities. Such activities include, without limitation: eating; reading; sleeping; smoking; getting dressed; turning down a bed; making a bed; brushing teeth; combing hair; talking on the phone; inhaling or injecting a medication; and activities related to hand hygiene or personal hygiene.
  • certain functions, features, and/or therapy related operations of the medication delivery device 102 can be adjusted or modified in response to the output of the gesture-based physical behavior detection system 104 and/or the output of at least one ancillary system 106 . More specifically, operation of the medication delivery device 102 can be controlled or regulated based on a determination that the user is experiencing stress. For example, if the system 100 determines that the user is under stress, then the therapy control algorithm of the medication delivery device 102 can be adjusted or replaced with a stress-correlated therapy control algorithm that compensates for the detected stress.
  • FIG. 9 is a flow chart that illustrates an automated method of operating a medication delivery system.
  • the example described here represents a control process 500 for a device that delivers insulin to a user, such as the insulin infusion device 302 shown in FIG. 7 .
  • This description assumes that the insulin infusion device is operating in a first mode of operation to automatically deliver the insulin medication to the user in accordance with a therapy control algorithm (task 502 ).
  • insulin delivery is controlled and regulated in accordance with various factors, sensor data, user-specific settings, a target glucose setpoint value, etc.
  • the baseline therapy control algorithm may include, utilize, or be defined by certain parameters, constants, thresholds, variables, limits, or the like, some of which may be user-specific, and some of which may be adjustable or dynamic in nature to alter the aggressiveness of insulin therapy.
  • a nominal or default therapy control algorithm can be used when the user is relatively stress free, under little to no stress, or within a threshold amount from an average stress level for that user.
  • the default therapy control algorithm can be adjusted or modified, or a different stress-correlated therapy control algorithm can be utilized, when the user is experiencing stress.
  • the process 500 receives stress-identifying data that indicates a current stress status of the user (task 504 ) and analyzes or processes at least some of the received stress-identifying data (e.g., the stress-related information 324 of FIG. 7 ) to determine whether the user is under stress while the insulin infusion device is operating in the first mode (query task 506 ).
  • the stress-identifying data is generated by sensors, detector units, or other sources of data that are included with or associated with a suitably configured gesture-based physical behavior detection system 400 (e.g., the accelerometer 406 , the gyroscope 410 , the proximity sensor 412 , one or more other sensors 430 , the microphone 428 , and/or the camera 418 ).
  • the stress-identifying data may be generated at least in part from gesture data obtained for the user.
  • the stress-identifying data may include user status data generated or provided by at least one ancillary system 106 or device (other than the gesture-physical behavior detection system 400 ) that monitors certain characteristics, status or condition of the user. Accordingly, the stress-identifying data may be generated at least in part from such user status data.
  • the process 500 continues by analyzing or processing at least some of the received stress-identifying data to determine whether the user is under stress (query task 506 ). If analysis of the stress-identifying data indicates that the user is not under stress or that the user's stress level is below a certain threshold (the “No” branch of query task 506 ), then the process 500 continues to operate the insulin infusion device in the first mode, using the same therapy control algorithm. If, however, the stress-identifying data indicates that the user is under stress while the insulin infusion device is operating in the first mode (the “Yes” branch of query task 506 ), then operation of the insulin infusion device is adjusted or changed in a stress-correlated manner to compensate for the detected level of stress, the type of stress, the duration of detected stress, etc.
  • the process 500 changes at least one therapy-altering factor of the currently active therapy control algorithm to obtain an appropriate stress-correlated therapy control algorithm that compensates for the detected stress condition (task 508 ).
  • the process 500 accesses, retrieves, or selects an appropriate stress-correlated therapy control algorithm that compensates for the detected stress condition (task 510 ).
  • changing the existing therapy control algorithm or selecting a new therapy control algorithm may be a function of a stress level of the user, as determined from the stress-identifying data.
  • the process 500 continues by operating the insulin infusion device in a second mode of operation (e.g., under the changed therapy control algorithm or the new therapy control algorithm) to automatically deliver the insulin medication to the user in accordance with the stress-correlated therapy control algorithm (task 512 ).
  • the second mode of operation compensates for user stress as determined from the stress-identifying data. This example assumes that the transition from the first mode of operation to the second mode of operation occurs automatically, and without any user input or involvement. In some embodiments, however, the process 500 may require a user confirmation before transitioning to the second mode of operation.
  • a remote data processing system (e.g., a cloud-based system such as the data processing system 116 shown in FIG. 1) may generate and send at least one command, instruction, or control signal to the medication delivery system 102, wherein the at least one command, instruction, or control signal causes the medication delivery system 102 to transition from the first mode of operation to the second mode of operation.
  • FIG. 9 shows the process 500 leading back to task 504 to receive updated or additional stress-identifying data for continued monitoring and processing during the second mode of operation. If the process 500 determines (from the updated stress-identifying data) that the user is no longer experiencing stress (the “No” branch of query task 506 ), then the baseline therapy delivery algorithm or any suitable algorithm can be utilized to transition away from the second mode of operation and return to the first mode of operation (task 502 ). Thereafter, the process 500 may continue as described above.
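The mode-switching loop of the process 500 can be summarized in a short Python sketch. The function and attribute names are placeholders assumed for illustration, not part of the disclosure:

```python
def run_control_loop(device, get_stress_data, is_under_stress):
    """Sketch of process 500: toggle between a baseline and a
    stress-correlated therapy control algorithm (tasks 502-512)."""
    mode = "first"  # baseline therapy control algorithm (task 502)
    while device.active:
        data = get_stress_data()              # task 504
        if is_under_stress(data):             # query task 506, "Yes" branch
            if mode == "first":
                # task 508/510: adjust or select a stress-correlated algorithm
                device.use_stress_correlated_algorithm(data)
                mode = "second"               # task 512
        elif mode == "second":
            # Stress has ended: return to the baseline algorithm (task 502).
            device.use_baseline_algorithm()
            mode = "first"
        device.deliver_insulin_step()
```

The loop naturally models the flow chart's return to task 504: stress-identifying data keeps being received and evaluated in both modes.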
  • the process 500 determines whether the user is under stress, based on the received stress-identifying data.
  • the stress-identifying data may include, for example, any of the following: raw (uncharacterized or unprocessed) or processed sensor data generated by the gesture-based physical behavior detection system 104 , 400 ; gesture data provided by the system 104 , 400 ; raw (uncharacterized or unprocessed) or processed sensor data generated by one or more ancillary systems 106 ; and raw (uncharacterized or unprocessed) or processed sensor data generated by the analyte sensor 112 .
  • the device or system that makes the stress determination has already been trained with historical data such that it can compare the received stress-identifying data against historical data, trends, patterns, and/or conditions that are known to be correlated with user stress. For example, if the received stress-identifying data includes gesture data that indicates physical behavior or stress-indicating gestures that historically correspond to work-related activity (e.g., typing, operating heavy machinery, delivering packages, public speaking), calendar data that historically indicates typical working hours, and GPS location data that historically indicates a typical work location, then the process 500 can determine and declare that the user is likely under stress.
  • the process 500 can determine and declare that the user is likely under stress (the “Yes” branch of query task 506 ).
  • the process 500 may declare that the user is currently under stress, identify times during which the user is under stress, indicate the type of stress (e.g., work-related stress, school-related stress, mental stress, physical stress, illness-related stress, or the like), quantify or classify a current stress level or severity, or the like, as appropriate for the particular embodiment and application of the system 100 .
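The multi-signal comparison against trained historical patterns described above might be sketched as a simple scoring rule. The feature names, history structure, and threshold are illustrative assumptions:

```python
def likely_under_stress(sample, history, threshold=2):
    """Sketch: count how many stress-correlated conditions the current
    stress-identifying data matches against trained historical patterns."""
    score = 0
    # Gesture data matching historically stress-indicating activity.
    if sample.get("gesture") in history["stress_gestures"]:
        score += 1
    # Calendar/time data falling within historically typical working hours.
    start, end = history["work_hours"]
    if start <= sample.get("hour", -1) < end:
        score += 1
    # GPS location data matching a historically typical work location.
    if sample.get("location") == history["work_location"]:
        score += 1
    return score >= threshold  # "Yes" branch of query task 506

# Assumed trained history for one user.
history = {"stress_gestures": {"typing", "public_speaking"},
           "work_hours": (9, 17),
           "work_location": "office"}
```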
  • the insulin infusion device transitions to the second mode of operation to deliver insulin in accordance with the stress-correlated therapy control algorithm.
  • the stress-correlated therapy control algorithm increases the aggressiveness of the insulin therapy provided by the insulin infusion device, relative to the baseline therapy control algorithm that is utilized for the first mode of operation. Increasing the aggressiveness is desirable to counteract any stress-induced increase in blood glucose.
  • one or more settings, parameters, or variables can be adjusted based on stress detection. For example, the user's target glucose setpoint value can be adjusted, or controller gain values (which are utilized by the automatic insulin delivery control algorithm) can be adjusted as a function of stress detection and/or certain stress-related characteristics.
  • the controller of the insulin infusion device employs a proportional-integral-derivative insulin feedback (PID-IFB) control algorithm designed for continuous closed-loop insulin delivery control.
  • the PID-IFB control algorithm includes PID gain values that are applied to an error term, a time derivative of sensor glucose term, and an integral error term (which is the integral action on historical errors between sensor glucose readings and the controller setpoint, such as 100 mg/dL).
  • certain implementations of the PID-IFB control algorithm calculate the IFB using time constants that can be adjusted based on stress detection or observed/measured stress characteristics.
  • a maximum insulin delivery limit (Umax) can also be adjusted based on stress detection or observed/measured stress characteristics.
  • the controller gain values, Umax, and/or time constants can be regulated to make the controller more or less responsive to changes in sensor glucose measurements during periods of user stress. It should be appreciated that insulin therapy can be changed in other ways based on detected user stress, and that the examples provided here are neither exhaustive nor limiting.
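A minimal sketch of how PID gains and Umax might be scaled with a detected stress level is shown below. The scaling factors, gain values, and normalization are illustrative assumptions, not values from the disclosure, and the insulin feedback term of the PID-IFB algorithm is omitted for brevity:

```python
def stress_adjusted_pid_params(base_kp, base_ki, base_kd, base_umax,
                               stress_level, max_scale=1.5):
    """Scale PID gain values and Umax to make closed-loop insulin delivery
    more aggressive as the detected stress level rises.

    stress_level is assumed to be normalized to [0, 1]."""
    level = min(max(stress_level, 0.0), 1.0)
    scale = 1.0 + (max_scale - 1.0) * level
    return {"kp": base_kp * scale,
            "ki": base_ki * scale,
            "kd": base_kd * scale,
            "umax": base_umax * scale}

def pid_step(params, setpoint, sensor_glucose, integral_error, d_glucose_dt):
    """One simplified PID step: proportional, integral, and derivative
    terms, with the commanded delivery capped at Umax."""
    error = sensor_glucose - setpoint  # e.g., a setpoint of 100 mg/dL
    u = (params["kp"] * error
         + params["ki"] * integral_error
         + params["kd"] * d_glucose_dt)
    return max(0.0, min(u, params["umax"]))
```

Under this sketch, a zero stress level reproduces the baseline controller, and the maximum detected stress scales both the gains and the delivery cap by `max_scale`.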
  • aggressiveness of the insulin therapy provided by the insulin infusion device is correlated with (e.g., proportional to) a measure or level of stress that is detected by the process 500 .
  • the control algorithm and/or therapy delivery actions can be adjusted, controlled, or regulated in a different manner using any desired methodology that is driven by the detected stress level. For example, increased aggressiveness may be associated with enabling automatic basal insulin delivery by the insulin infusion device, temporarily increasing the basal rate of insulin, enabling an automatic bolus delivery feature, administering or recommending an additional insulin bolus, lowering the target glucose setpoint value, temporarily increasing insulin limits that govern the delivery of insulin to the user, or the like.
  • less aggressive insulin therapy may be associated with capped or limited insulin boluses, using an upper limit on a current sensor glucose value for purposes of calculating and administering an automatic bolus, or temporarily using a modified basal delivery profile (e.g., a reduced or flat profile) for the delivery of insulin.
  • FIG. 10 is a flow chart that illustrates a training process 600 according to certain embodiments.
  • the system 100 can be initialized or trained with historical data for purposes of determining whether the user is under stress, based on obtained gesture data and/or ancillary user status data. Accordingly, the process 600 can be employed with certain embodiments to train the stress detection feature. It should be appreciated that other methodologies, including those that need not employ “training” per se, can be utilized in an implementation of the system 100 .
  • the process 600 obtains gesture training data, which is provided by the gesture-based physical behavior detection system 104 , 400 during one or more training sessions or periods of time (task 602 ).
  • the process 600 obtains ancillary user status training data (e.g., location, date/time, heart rate, blood pressure, sweat, temperature, physical activity, exercise, or any other type of ancillary data 106 described above with reference to FIG. 1 ), which is provided by one or more ancillary systems 106 during one or more training sessions or periods of time (task 604 ).
  • the process 600 also obtains stress marker data, which may be entered by the user, during the training sessions or periods of time (task 606 ).
  • the stress marker data can be obtained in response to the user interacting with one or more user devices 108 to record, flag, mark, or otherwise identify points in time or periods of time at which the user is under stress.
  • the stress marker data may also include information that characterizes or describes the type of stress, the severity or magnitude of the stress (e.g., low, average, or high stress level), or other metadata related to the recorded stress.
  • the process 600 may continue by temporally correlating the obtained training data (e.g., gesture training data and/or ancillary user training data) with the obtained stress marker data (task 608 ).
  • the temporal correlation can be utilized to identify and record certain stress-indicating gestures performed by the user while under stress and/or to identify and record user status information obtained during the marked period of user stress (task 610 ), along with corresponding time/date stamp data.
  • different combinations of training data can be used to identify or classify a stress event during training. For example, if the user's heart rate, sweat level, and body temperature exceed certain threshold values and other data (such as calendar information, location information, or date/time information) indicate a potentially stressful condition, then the process 600 can define the particular combination of data as a stress event. Accordingly, different combinations of physiological data, gesture data, and non-physiological data, and applicable thresholds or measurement ranges, can be analyzed for purposes of characterizing and defining stress-related events for a particular user.
  • the training process 600 may continue by correlating user stress levels and/or periods of stress with historical user data related to physiological condition, measured analyte levels (e.g., obtained from the analyte sensor 112 ), therapy outcomes, fitness or activity logs, or the like (task 612 ).
  • the system 100 can be trained in a way that links detectable periods of user stress to the operation and control of the medication delivery system 102 .
  • the process 600 may generate and save one or more stress-correlated therapy control algorithms, settings, device configurations, or the like (task 614 ).
  • a stress-correlated therapy control algorithm can be used going forward (after the training period ends) in response to the automatic detection of user stress.
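The temporal-correlation step of the training process 600 (tasks 608-610) can be sketched as labeling time-stamped training samples with any overlapping user-entered stress markers. The data shapes below are assumptions chosen for illustration:

```python
def label_training_data(samples, stress_markers):
    """Sketch of tasks 608-610: tag each time-stamped training sample
    (gesture or ancillary user status data) with the user-entered stress
    marker, if any, whose time window covers the sample's timestamp.

    samples: list of dicts, each with a "t" timestamp
    stress_markers: list of (start, end, stress_type) tuples
    """
    labeled = []
    for s in samples:
        label = None
        for start, end, stress_type in stress_markers:
            if start <= s["t"] <= end:
                label = stress_type
                break
        labeled.append({**s, "stress": label})
    return labeled

# Example training data: the user marked minutes 0-30 as work-related stress.
samples = [{"t": 10, "gesture": "typing", "heart_rate": 92},
           {"t": 55, "gesture": "walking", "heart_rate": 70}]
markers = [(0, 30, "work-related")]
```

The labeled samples could then feed the correlation of stress periods with analyte levels and therapy outcomes (task 612) and the generation of stress-correlated settings (task 614).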

Abstract

A method of operating a system having a fluid pump mechanism and a related controller involves: operating the system in a first mode to automatically deliver the medication in accordance with a therapy control algorithm; and receiving stress-identifying data generated at least in part from gesture data for the user. The gesture data is provided by a gesture-based physical behavior detection system. The method determines, from the stress-identifying data, that the user is under stress while the medication delivery system is operating in the first mode. In response to detecting stress, the system is operated in a second mode to automatically deliver the medication to the user in accordance with a stress-correlated therapy control algorithm. The second mode compensates for user stress as determined from the stress-identifying data.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. provisional patent application No. 62/948,015, filed Dec. 13, 2019.
  • TECHNICAL FIELD
  • The present technology is generally related to the control, operation, and regulation of a medication delivery system in a way that leverages an automated stress detection system.
  • BACKGROUND
  • Medical therapy delivery systems, such as fluid infusion devices, are relatively well known in the medical arts for use in delivering or dispensing an agent, such as insulin or another prescribed medication, to a patient. A typical insulin infusion device includes a fluid pump mechanism and an associated drive system that actuates a plunger or piston of a fluid reservoir to deliver fluid medication from the reservoir to the body of a patient via a fluid delivery conduit between the reservoir and the body. Use of infusion pump therapy has been increasing, especially for delivering insulin to diabetic patients.
  • Control schemes have been developed to allow insulin infusion devices to monitor and regulate a patient's blood glucose level in a substantially continuous and autonomous manner. An insulin infusion device can be operated in an automatic mode wherein basal insulin is delivered at a rate that is automatically adjusted for the user. Moreover, an insulin infusion device can be operated to automatically calculate, recommend, and deliver insulin boluses as needed (e.g., to compensate for meals consumed by the user). Ideally, the amount of an insulin bolus should be accurately calculated and administered to maintain the user's blood glucose within the desired range. In particular, an automatically generated and delivered insulin bolus should safely manage the user's blood glucose level and keep it above a defined threshold level. To this end, an insulin infusion device operating in an automatic mode uses continuous glucose sensor data and control algorithms to regulate the user's blood glucose, based on a target glucose setpoint setting and user-initiated meal announcements that typically include estimations of the amount of carbohydrates to be consumed in an upcoming meal.
  • BRIEF SUMMARY
  • The subject matter of this disclosure generally relates to a system that automatically detects when a user is experiencing stress and, in response to the detection, regulates, controls, or adjusts the operation of a medication delivery system in a stress-correlated manner.
  • In one aspect, the present disclosure provides a method of operating a medication delivery system having a fluid pump mechanism and at least one controller that regulates operation of the fluid pump mechanism to deliver medication from the medication delivery system. The method involves: operating the medication delivery system in a first mode of operation to automatically deliver the medication to a user in accordance with a therapy control algorithm; receiving stress-identifying data that indicates a current stress status of the user, the stress-identifying data generated at least in part from gesture data for the user, the gesture data provided by a gesture-based physical behavior detection system; determining, from the stress-identifying data, that the user is under stress while the medication delivery system is operating in the first mode; and in response to the determining, operating the medication delivery system in a second mode of operation to automatically deliver the medication to the user in accordance with a stress-correlated therapy control algorithm. The second mode of operation compensates for user stress as determined from the stress-identifying data.
  • In another aspect, the disclosure provides a medication delivery system having: a fluid pump mechanism; at least one controller that regulates operation of the fluid pump mechanism to deliver insulin from the medication delivery system; and at least one memory element associated with the at least one controller, the at least one memory element storing processor-executable instructions configurable to be executed by the at least one controller to perform a method of controlling operation of the medication delivery system. The method involves: operating the medication delivery system in a first mode of operation to automatically deliver the insulin to a user in accordance with a therapy control algorithm; receiving stress-identifying data that indicates a current stress status of the user, the stress-identifying data generated at least in part from gesture data for the user, the gesture data provided by a gesture-based physical behavior detection system; determining, from the stress-identifying data, that the user is under stress while the medication delivery system is operating in the first mode; and in response to the determining, operating the medication delivery system in a second mode of operation to automatically deliver the insulin to the user in accordance with a stress-correlated therapy control algorithm, wherein the second mode of operation compensates for user stress as determined from the stress-identifying data.
  • In another aspect, the disclosure provides a system having: an insulin infusion device that regulates delivery of insulin to a user; a gesture-based physical behavior detection system configured to generate gesture data for the user, and configured to communicate the gesture data; and at least one controller that controls operation of the insulin infusion device. The at least one controller is configured to: operate the insulin infusion device in a first mode of operation to automatically deliver the insulin to the user in accordance with a therapy control algorithm; process stress-identifying data that indicates a current stress status of the user, the stress-identifying data generated at least in part from gesture data provided by the gesture-based physical behavior detection system; determine, from the stress-identifying data, that the user is under stress while the insulin infusion device is operating in the first mode; and in response to determining that the user is under stress, operate the insulin infusion device in a second mode of operation to automatically deliver the insulin to the user in accordance with a stress-correlated therapy control algorithm. The second mode of operation compensates for user stress as determined from the stress-identifying data.
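  • The two-mode behavior recited in the aspects above can be sketched in a few lines of code. This is an illustrative sketch only: the names TherapyController and StressStatus are hypothetical, and the disclosure does not specify how the stress-correlated therapy control algorithm differs from the nominal one.

```python
# Hypothetical sketch of the first/second mode switching described above.
# Class and attribute names are illustrative, not from the disclosure.
from dataclasses import dataclass

@dataclass
class StressStatus:
    under_stress: bool   # derived from the stress-identifying data
    level: float         # e.g., 0.0 (calm) through 1.0 (high stress)

class TherapyController:
    NOMINAL = "nominal"                        # first mode: therapy control algorithm
    STRESS_COMPENSATED = "stress_compensated"  # second mode: stress-correlated algorithm

    def __init__(self) -> None:
        self.mode = self.NOMINAL

    def update(self, status: StressStatus) -> str:
        # Operate in the second mode while the user is determined to be under
        # stress, and revert to the first mode otherwise.
        if status.under_stress:
            self.mode = self.STRESS_COMPENSATED
        else:
            self.mode = self.NOMINAL
        return self.mode
```

  • A caller would feed each new stress determination into `update` and select the delivery algorithm from the returned mode.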
  • The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a simplified block diagram representation of an exemplary embodiment of a system that includes a medication delivery system that responds to patient stress levels as indicated by the output of a gesture-based physical behavior detection system;
  • FIG. 2 is a plan view of an exemplary embodiment of an insulin infusion device that is suitable for use as the medication delivery system shown in FIG. 1;
  • FIG. 3 is a top perspective view of an embodiment of an insulin infusion device implemented as a patch pump device that is suitable for use as the medication delivery system shown in FIG. 1;
  • FIG. 4 is a perspective view of an exemplary embodiment of a smart insulin pen that is suitable for use as the medication delivery system shown in FIG. 1;
  • FIG. 5 is a perspective view of an exemplary embodiment of a smart pen accessory that is suitable for use with the medication delivery system shown in FIG. 1;
  • FIG. 6 is a block diagram representation of an exemplary embodiment of a computer-based or processor-based device suitable for deployment in the system shown in FIG. 1;
  • FIG. 7 is a block diagram representation of a closed loop glucose control system arranged in accordance with certain embodiments;
  • FIG. 8 is a block diagram representation of a gesture-based physical behavior detection system arranged in accordance with certain embodiments;
  • FIG. 9 is a flow chart that illustrates an infusion device control process according to certain embodiments; and
  • FIG. 10 is a flow chart that illustrates a training process according to certain embodiments.
  • DETAILED DESCRIPTION
  • The following detailed description is merely illustrative in nature and is not intended to limit the embodiments of the subject matter or the application and uses of such embodiments. As used herein, the word “exemplary” means “serving as an example, instance, or illustration.” Any implementation described herein as exemplary is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.
  • It should be understood that various aspects disclosed herein may be combined in different arrangements than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the techniques). In addition, while certain aspects of this disclosure are described as being performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a medical device.
  • In one or more examples, the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
  • Instructions may be configurable to be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
  • Techniques and technologies may be described herein in terms of functional and/or logical block components, and with reference to symbolic representations of operations, processing tasks, and functions that may be performed by various computing components or devices. Such operations, tasks, and functions are sometimes referred to as being computer-executed, computerized, software-implemented, or computer-implemented. It should be appreciated that the various block components shown in the figures may be realized by any number of hardware, software, and/or firmware components configured to perform the specified functions. For example, an embodiment of a system or a component may employ various integrated circuit components, e.g., memory elements, digital signal processing elements, logic elements, look-up tables, or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
  • FIG. 1 is a simplified block diagram representation of an exemplary embodiment of a system 100 that responds to patient stress levels by adjusting at least one setting, function, or therapy-related operation of a medication delivery system 102. In certain embodiments, the medication delivery system 102 responds to patient stress levels as indicated by the output of a gesture-based physical behavior detection system 104 and/or the output of at least one ancillary sensor, detector, or measurement system 106 (hereinafter referred to as ancillary system(s) 106). Certain embodiments of the system 100 include, without limitation: the medication delivery system 102 (or device) that regulates delivery of medication to a user; at least one gesture-based physical behavior detection system 104 that monitors user behavior and/or status to obtain gesture data that indicates whether the user is under stress; at least one ancillary system 106; at least one user device 108 that includes or cooperates with a suitably written and configured patient care application 110; an analyte sensor 112 to measure a physiological characteristic of the user, such that sensor data obtained from the analyte sensor 112 can be used to control, regulate, or otherwise influence the operation of the medication delivery system 102; and at least one patient history and outcomes database 114. In accordance with certain cloud-implemented embodiments, the system includes at least one data processing system 116, which may be in communication with any or all of the other components of the system 100. Other configurations and topologies for the system 100 are also contemplated here, such as a system that includes additional intermediary, interface, or data repeating devices in the data path between a sending device and a receiving device.
  • At least some of the components of the system 100 are communicatively coupled with one another to support data communication, signaling, and/or transmission of control commands as needed, via at least one communications network 120. The at least one communications network 120 may support wireless data communication and/or data communication using tangible data communication links. FIG. 1 depicts network communication links in a simplified manner. In practice, the system 100 may cooperate with and leverage any number of wireless and any number of wired data communication networks maintained or operated by various entities and providers. Accordingly, communication between the various components of the system 100 may involve multiple network links and different data communication protocols. In this regard, the network can include or cooperate with any of the following, without limitation: a local area network; a wide area network; the Internet; a personal area network; a near-field data communication link; a cellular communication network; a satellite communication network; a video services or television broadcasting network; a network onboard a vehicle; or the like. The components of the system 100 may be suitably configured to support a variety of wireless and wired data communication protocols, technologies, and techniques as needed for compatibility with the at least one communication network 120.
  • The system 100 can support any type of medication delivery system 102 that is compatible with the features and functionality described here. For example, the medication delivery system 102 may be realized as a user-activated or user-actuated fluid delivery device, such as a manual syringe, an injection pen, or the like. As another example, the medication delivery system 102 may be implemented as an electronic device that is operated to regulate the delivery of medication fluid to the user. In certain embodiments, however, the medication delivery system 102 includes or is realized as an insulin infusion device, e.g., a portable patient-worn or patient-carried insulin pump. In such embodiments, the analyte sensor 112 includes or is realized as a glucose meter, a glucose sensor, or a continuous glucose monitor. For the sake of brevity, conventional techniques related to insulin infusion device operation, infusion set operation, and other functional aspects of the systems (and the individual operating components of the systems) may not be described in detail here. Examples of infusion pumps may be of the type described in, but not limited to, U.S. Pat. Nos.: 4,562,751; 4,685,903; 5,080,653; 5,505,709; 5,097,122; 6,485,465; 6,554,798; 6,558,320; 6,558,351; 6,641,533; 6,659,980; 6,752,787; 6,817,990; 6,932,584; and 7,621,893; each of which is incorporated herein by reference.
  • FIG. 2 is a plan view of an exemplary embodiment of an insulin infusion device 130 suitable for use as the medication delivery system 102 shown in FIG. 1. The insulin infusion device 130 is a portable medical device designed to be carried or worn by the patient. The illustrated embodiment of the insulin infusion device 130 includes a housing 132 adapted to receive an insulin-containing reservoir (hidden from view in FIG. 2). An opening in the housing 132 accommodates a fitting 134 (or cap) for the reservoir, with the fitting 134 being configured to mate or otherwise interface with tubing 136 of an infusion set 138 that provides a fluid path to/from the body of the user. In this manner, fluid communication from the interior of the insulin reservoir to the user is established via the tubing 136. The illustrated version of the insulin infusion device 130 includes a human-machine interface (HMI) 140 (or user interface) that includes elements that can be manipulated by the user to administer a bolus of fluid (e.g., insulin), to change therapy settings, to change user preferences, to select display features, and the like. The insulin infusion device 130 also includes a display 142, such as a liquid crystal display (LCD) or another suitable display technology, that can be used to present various types of information or data to the user, such as, without limitation: the current glucose level of the patient; the time; a graph or chart of the patient's glucose level versus time; device status indicators; etc. The insulin infusion device 130 may be configured and controlled to support other features and interactive functions described in more detail below.
  • FIG. 3 is a top perspective view of an embodiment of an insulin infusion device 146 implemented as a patch pump device that is suitable for use as the medication delivery system 102 shown in FIG. 1. The insulin infusion device 146 can be implemented as a combination device that includes an insertable insulin delivery cannula and an insertable glucose sensor (both of which are hidden from view in FIG. 3). In such an implementation, the glucose sensor may take the place of the separate analyte sensor 112 shown in FIG. 1. The insulin infusion device 146 includes a housing 148 that serves as a shell for a variety of internal components. FIG. 3 shows the insulin infusion device 146 with a removable fluid cartridge module 150 installed and secured therein. The housing 148 is suitably configured to receive, secure, and release the removable fluid cartridge module 150. The insulin infusion device 146 includes at least one user interface feature, which can be actuated by the patient as needed. The illustrated embodiment of the insulin infusion device 146 includes a button 152 that is physically actuated. The button 152 can be a multipurpose user interface if so desired to make it easier for the user to operate the insulin infusion device 146. In this regard, the button 152 can be used in connection with one or more of the following functions, without limitation: waking up the processor and/or electronics of the insulin infusion device 146; triggering an insertion mechanism to insert a fluid delivery cannula and/or an analyte sensor into the subcutaneous space or similar region of the user; configuring one or more settings of the insulin infusion device 146; initiating delivery of medication fluid from the fluid cartridge module 150; initiating a fluid priming operation; disabling alerts or alarms generated by the insulin infusion device 146; and the like. In lieu of the button 152, the insulin infusion device 146 can employ a slider mechanism, a pin, a lever, a switch, a touch-sensitive element, or the like. In certain embodiments, the insulin infusion device 146 may be configured and controlled to support other features and interactive functions described in more detail below.
  • FIG. 4 is a perspective view of an exemplary embodiment of a smart insulin pen 160 suitable for use as the medication delivery system shown in FIG. 1. The pen 160 includes an injector body 162 and a cap 164. FIG. 4 shows the cap 164 removed from the injector body 162, such that a delivery needle 166 is exposed. The pen 160 includes suitably configured electronics and processing capability to communicate with an application running on a user device, such as a smartphone, to support various functions and features such as: tracking active insulin; calculating insulin dosages (boluses); tracking insulin dosages; monitoring insulin supply levels; patient reminders and notifications; and patient status reporting. In certain embodiments, the smart insulin pen 160 can receive insulin dosage recommendations or instructions and/or recommended dosing times (or a recommended dosing schedule). Moreover, the smart insulin pen 160 may be configured and controlled to support other features and interactive functions described in more detail below.
  • FIG. 5 is a perspective view of an exemplary embodiment of a smart pen accessory 170 that is suitable for use with the medication delivery system 102 shown in FIG. 1. In particular, the smart pen accessory 170 cooperates with a “non-smart” insulin pen that lacks the intelligence and functionality of a smart insulin pen (as described above). The smart pen accessory 170 can be realized as a pen cap, a clip-on apparatus, a sleeve, or the like. The smart pen accessory 170 is attached to an insulin pen 172 such that the smart pen accessory 170 can measure the amount of insulin delivered by the insulin pen 172. The insulin dosage data is stored by the smart pen accessory 170 along with corresponding date/time stamp information. In certain embodiments, the smart pen accessory 170 can receive, store, and process additional patient-related or therapy-related data, such as glucose data. Indeed, the smart pen accessory 170 may also support various features and functions described above in the context of the smart insulin pen 160. For example, the smart pen accessory 170 may be configured to receive insulin dosage recommendations or instructions and/or recommended dosing times (or a recommended dosing schedule). Moreover, the smart pen accessory 170 may be configured and controlled to support other features and interactive functions described in more detail below.
  • Generally, a fluid infusion device (such as an insulin infusion device) includes a fluid pump mechanism having a motor or other actuation arrangement that is operable to linearly displace a plunger (or stopper) of a fluid reservoir provided within the fluid infusion device to deliver a dosage of fluid medication, such as insulin, to the body of a user. Dosage commands that govern operation of the motor may be generated in an automated manner in accordance with the delivery control scheme associated with a particular operating mode, and the dosage commands may be generated in a manner that is influenced by a current (or most recent) measurement of a physiological condition in the body of the user. For a glucose control system suitable for use by diabetic patients, a closed-loop or automatic operating mode can be used to generate insulin dosage commands based on a difference between a current (or most recent) measurement of the interstitial fluid glucose level in the body of the user and a target (or reference) glucose setpoint value. In this regard, the rate of infusion may vary as the difference between a current measurement value and the target measurement value fluctuates. For purposes of explanation, the subject matter is described herein in the context of the infused fluid being insulin for regulating a glucose level of a user (or patient); however, it should be appreciated that many other fluids may be administered through infusion, and the subject matter described herein is not necessarily limited to use with insulin.
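  • The setpoint-based dosing described in the preceding paragraph can be illustrated with a deliberately simplified proportional rule. This is a sketch under stated assumptions only: the function name, gain, and limits below are placeholders, and real closed-loop controllers are considerably more sophisticated (accounting for insulin on board, safety constraints, and so on).

```python
# Simplified, illustrative dose computation for a closed-loop operating mode:
# the commanded infusion rate varies with the difference between the current
# glucose measurement and the target (reference) glucose setpoint. Gains and
# limits here are placeholders, not values from the disclosure.
def insulin_rate(glucose_mgdl: float, setpoint_mgdl: float,
                 basal_rate_u_per_hr: float, gain: float,
                 max_rate_u_per_hr: float) -> float:
    """Return a commanded infusion rate in units/hour."""
    error = glucose_mgdl - setpoint_mgdl         # positive when above target
    rate = basal_rate_u_per_hr + gain * error    # proportional adjustment
    return min(max(rate, 0.0), max_rate_u_per_hr)  # clamp to a safe range
```

  • For example, with a 120 mg/dL setpoint, a 1.0 U/hr basal rate, a gain of 0.01, and a 3.0 U/hr ceiling, a measurement of 180 mg/dL yields 1.0 + 0.01 × 60 = 1.6 U/hr; as the measured value approaches the setpoint, the commanded rate returns toward the basal rate.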
  • The analyte sensor 112 may communicate sensor data to the medication delivery system 102 for use in regulating or controlling operation of the medication delivery system 102. Alternatively or additionally, the analyte sensor 112 may communicate sensor data to one or more other components in the system 100, such as, without limitation: a user device 108 (for use with the patient care application 110); a data processing system 116; and/or a patient history and outcomes database 114.
  • The system 100 can support any number of user devices 108 linked to the particular user or patient. In this regard, a user device 108 may be, without limitation: a smartphone device; a laptop, desktop, or tablet computer device; a medical device; a wearable device; a global positioning system (GPS) receiver device; a system, component, or feature onboard a vehicle; a smartwatch device; a television system; a household appliance; a video game device; a media player device; or the like. For the example described here, the medication delivery system 102 and the at least one user device 108 are owned by, operated by, or otherwise linked to a user/patient. Any given user device 108 can host, run, or otherwise execute the patient care application 110. In certain embodiments, for example, the user device 108 is implemented as a smartphone with the patient care application 110 installed thereon. In accordance with another example, the patient care application 110 is implemented in the form of a website or webpage, e.g., a website of a healthcare provider, a website of the manufacturer, supplier, or retailer of the medication delivery system 102, or a website of the manufacturer, supplier, or retailer of the analyte sensor 112. In accordance with another example, the medication delivery system 102 executes the patient care application 110 as a native function.
  • In certain embodiments, at least some of the features or output of the gesture-based physical behavior detection system 104 and/or the ancillary system(s) 106 can be used to influence features, functions, and/or therapy-related operations of the medication delivery system 102. In particular, the systems 104, 106 may be suitably configured and operated to generate and provide output (e.g., data, control signals, markers, or flags) that indicates user stress, such that the medication delivery system 102 can dynamically respond in a stress-correlated manner to compensate for detected user stress.
  • As described in more detail below, the gesture-based physical behavior detection system 104 includes one or more sensors, detectors, measurement devices, and/or readers to automatically detect certain user gestures that correlate to user stress (e.g., work-related physical activity, commuting, arguing, fighting, stress or nervous eating, stress or nervous drinking) or lack thereof (e.g., napping, normal eating or drinking, painting, jogging, or dancing). The gesture-based physical behavior detection system 104 may communicate gesture data to the medication delivery system 102, the user device 108, and/or the data processing system 116 for processing in an appropriate manner for use in regulating or controlling certain functions of the medication delivery system 102. For example, the gesture data may be communicated to a user device 108, such that the user device 108 can process the gesture data and inform the user or the medication delivery system 102 as needed (e.g., remotely regulate or control certain functions of the medication delivery system 102). As another example, the gesture-based physical behavior detection system 104 may communicate the gesture data to one or more cloud computing systems or servers (such as a remote data processing system 116) for appropriate processing and handling in the manner described herein.
  • Similarly, an ancillary system 106 may include one or more sensors, detectors, measurement devices, and/or readers that obtain ancillary user status data that correlates to user stress or lack thereof. In certain embodiments, an ancillary system 106 may include, cooperate with, or be realized as any of the following, without limitation: a heart rate monitor linked to the user; a blood pressure monitor linked to the user; a respiratory rate monitor linked to the user; a vital signs monitor linked to the user; a thermometer (for the user's body temperature and/or the environmental temperature); a sweat detector linked to the user; an activity tracker linked to the user; a global positioning system (GPS); a clock, calendar, or appointment application linked to the user; a pedometer linked to the user; or the like. An ancillary system 106 may be configured and operated to communicate its output (user status data) to one or more components of the system 100 for analysis, processing, and handling in the manner described herein. In certain embodiments, user status data obtained from one or more ancillary systems 106 supplements the gesture data obtained from the gesture-based physical behavior detection system 104, such that periods of user stress and corresponding stress levels are accurately and reliably detected.
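  • One way the ancillary user status data might supplement the gesture data is sketched below: gesture events vote a stress score up or down, and an ancillary signal such as elevated heart rate reinforces the gesture evidence. The gesture labels, weights, and thresholds are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical fusion of gesture data with one ancillary signal (heart rate).
# Gesture labels, weights, and thresholds are illustrative placeholders.
STRESS_GESTURES = {"nervous_eating", "nervous_drinking", "arguing"}
CALM_GESTURES = {"napping", "normal_eating", "jogging", "dancing"}

def stress_score(gestures, heart_rate_bpm, resting_heart_rate_bpm):
    """Return a stress score in [0.0, 1.0] from gesture and ancillary data."""
    score = 0.0
    for gesture in gestures:
        if gesture in STRESS_GESTURES:
            score += 0.4   # gesture that correlates to user stress
        elif gesture in CALM_GESTURES:
            score -= 0.2   # gesture that suggests lack of stress
    # Ancillary user status data supplements the gesture evidence.
    if heart_rate_bpm > 1.2 * resting_heart_rate_bpm:
        score += 0.3
    return max(0.0, min(1.0, score))
```

  • A downstream component could then compare the score against a threshold to flag a period of user stress and its corresponding stress level.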
  • In certain embodiments, the gesture-based physical behavior detection system 104 and the medication delivery system 102 are implemented as physically distinct and separate components, as depicted in FIG. 1. In such embodiments, the gesture-based physical behavior detection system 104 is external to the medication delivery system 102 and is realized as an ancillary component, relative to the medication delivery system 102. In accordance with alternative embodiments, however, the medication delivery system 102 and the gesture-based physical behavior detection system 104 can be combined into a single hardware component or provided as a set of attached hardware devices. For example, the medication delivery system 102 may include the gesture-based physical behavior detection system 104 or integrate the functionality of the system 104. Similarly, the analyte sensor 112 can be incorporated with the medication delivery system 102 or the gesture-based physical behavior detection system 104. These and other arrangements, deployments, and topologies of the system 100 are contemplated by this disclosure.
  • The at least one patient history and outcomes database 114 includes historical data related to the user's physical condition, physiological response to the medication regulated by the medication delivery system 102, stress-related or stress-correlated factors, and the like. In accordance with embodiments where the medication delivery system 102 is an insulin infusion device and the analyte sensor 112 is a glucose meter, sensor, or monitor, the database 114 can maintain any of the following, without limitation: historical glucose data and corresponding date/time stamp information; insulin delivery and dosage information; user-entered stress markers or indicators; gesture data (provided by the gesture-based physical behavior detection system 104) and corresponding date/time stamp information; ancillary user status data (provided by one or more ancillary systems 106) and corresponding date/time stamp data; diet or food intake history for the user; physical activity data, such as an exercise log; and any other information that may be generated by or used by the system 100 for purposes of controlling the operation of the medication delivery system 102. In certain embodiments, the at least one patient history and outcomes database 114 can receive and maintain training data that is utilized to train, configure, and initialize the system 100 based on historical user behavior, physiological state, operation of the medication delivery system 102, and user-identified periods of stress.
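  • A time-stamped record of the kind the patient history and outcomes database 114 might maintain can be sketched as follows. The field names are hypothetical, chosen only to mirror the categories listed above; the disclosure does not prescribe a schema.

```python
# Illustrative time-stamped history record; field names are assumptions that
# mirror the categories listed above (glucose, dosage, gestures, ancillary
# user status data, and user-entered stress markers).
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class PatientHistoryEntry:
    timestamp: datetime                        # date/time stamp for the record
    glucose_mgdl: Optional[float] = None       # historical glucose data
    insulin_units: Optional[float] = None      # insulin delivery and dosage info
    gesture: Optional[str] = None              # from the gesture-based detection system
    ancillary: dict = field(default_factory=dict)  # e.g., heart rate, GPS location
    stress_marker: Optional[bool] = None       # user-entered stress indicator
```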
  • A patient history and outcomes database 114 may reside at a user device 108, at the medication delivery system 102, at a data processing system 116, or at any network-accessible location (e.g., a cloud-based database or server system). In certain embodiments, a patient history and outcomes database 114 may be included with the patient care application 110. The patient history and outcomes database 114 enables the system 100 to generate recommendations, warnings, predictions, and guidance for the user and/or to regulate the manner in which the medication delivery system 102 administers therapy to the user, based on detected stress levels and periods of stress.
  • In accordance with certain embodiments, any or all of the components shown in FIG. 1 can be implemented as a computer-based or a processor-based device, system, or component having suitably configured hardware and software written to perform the functions and methods needed to support the features described herein. In this regard, FIG. 6 is a simplified block diagram representation of an exemplary embodiment of a computer-based or processor-based device 200 that is suitable for deployment in the system 100 shown in FIG. 1.
  • The illustrated embodiment of the device 200 is intended to be a high-level and generic representation of one suitable platform. In this regard, any computer-based or processor-based component of the system 100 can utilize the architecture of the device 200. The illustrated embodiment of the device 200 generally includes, without limitation: at least one controller (or processor) 202; a suitable amount of memory 204 that is associated with the at least one controller 202; device-specific items 206 (including, without limitation: hardware, software, firmware, user interface (UI), alerting, and notification features); a power supply 208 such as a disposable or rechargeable battery; a communication interface 210; at least one application programming interface (API) 212; and a display element 214. Of course, an implementation of the device 200 may include additional elements, components, modules, and functionality configured to support various features that are unrelated to the primary subject matter described here. For example, the device 200 may include certain features and elements to support conventional functions that might be related to the particular implementation and deployment of the device 200. In practice, the elements of the device 200 may be coupled together via at least one bus or any suitable interconnection architecture 216.
  • The at least one controller 202 may be implemented or performed with a general purpose processor, a content addressable memory, a microcontroller unit, a digital signal processor, an application specific integrated circuit, a field programmable gate array, any suitable programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination designed to perform the functions described here. Moreover, the at least one controller 202 may be implemented as a combination of computing devices, e.g., a combination of a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other such configuration.
  • The memory 204 may be realized as at least one memory element, device, module, or unit, such as: RAM memory, flash memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. In this regard, the memory 204 can be coupled to the at least one controller 202 such that the at least one controller 202 can read information from, and write information to, the memory 204. In the alternative, the memory 204 may be integral to the at least one controller 202. As an example, the at least one controller 202 and the memory 204 may reside in an ASIC. At least a portion of the memory 204 can be realized as a computer storage medium that is operatively associated with the at least one controller 202, e.g., a tangible, non-transitory computer-readable medium having computer-executable instructions stored thereon. The computer-executable instructions are configurable to be executed by the at least one controller 202 to cause the at least one controller 202 to perform certain tasks, operations, functions, and processes that are specific to the particular embodiment. In this regard, the memory 204 may represent one suitable implementation of such computer-readable media. Alternatively or additionally, the device 200 could receive and cooperate with computer-readable media (not separately shown) that is realized as a portable or mobile component or platform, e.g., a portable hard drive, a USB flash drive, an optical disc, or the like.
  • The device-specific items 206 may vary from one embodiment of the device 200 to another. For example, the device-specific items 206 will support: sensor device operations when the device 200 is realized as a sensor device; smartphone features and functionality when the device 200 is realized as a smartphone; activity tracker features and functionality when the device 200 is realized as an activity tracker; smart watch features and functionality when the device 200 is realized as a smart watch; medical device features and functionality when the device is realized as a medical device; etc. In practice, certain portions or aspects of the device-specific items 206 may be implemented in one or more of the other blocks depicted in FIG. 6.
  • If present, the UI of the device 200 may include or cooperate with various features to allow a user to interact with the device 200. Accordingly, the UI may include various human-to-machine interfaces, e.g., a keypad, keys, a keyboard, buttons, switches, knobs, a touchpad, a joystick, a pointing device, a virtual writing tablet, a touch screen, a microphone, or any device, component, or function that enables the user to select options, input information, or otherwise control the operation of the device 200. The UI may include one or more graphical user interface (GUI) control elements that enable a user to manipulate or otherwise interact with an application via the display element 214. The display element 214 and/or the device-specific items 206 may be utilized to generate, present, render, output, and/or annunciate alerts, alarms, messages, or notifications that are associated with operation of the medication delivery system 102, associated with a status or condition of the user, associated with operation, status, or condition of the system 100, etc.
  • The communication interface 210 facilitates data communication between the device 200 and other components as needed during the operation of the device 200. In the context of this description, the communication interface 210 can be employed to transmit or stream device-related control data, patient-related user status (e.g., gesture data or status data), device-related status or operational data, sensor data, calibration data, and the like. It should be appreciated that the particular configuration and functionality of the communication interface 210 can vary depending on the hardware platform and specific implementation of the device 200. In practice, an embodiment of the device 200 may support wireless data communication and/or wired data communication, using various data communication protocols. For example, the communication interface 210 could support one or more wireless data communication protocols, techniques, or methodologies, including, without limitation: RF; IrDA (infrared); Bluetooth; BLE; ZigBee (and other variants of the IEEE 802.15 protocol); IEEE 802.11 (any variation); IEEE 802.16 (WiMAX or any other variation); Direct Sequence Spread Spectrum; Frequency Hopping Spread Spectrum; cellular/wireless/cordless telecommunication protocols; wireless home network communication protocols; paging network protocols; magnetic induction; satellite data communication protocols; wireless hospital or health care facility network protocols such as those operating in the WMTS bands; GPRS; and proprietary wireless data communication protocols such as variants of Wireless USB. Moreover, the communication interface 210 could support one or more wired/cabled data communication protocols, including, without limitation: Ethernet; powerline; home network communication protocols; USB; IEEE 1394 (Firewire); hospital network communication protocols; and proprietary data communication protocols.
  • The at least one API 212 supports communication and interactions between software applications and logical components that are associated with operation of the device 200. For example, one or more APIs 212 may be configured to facilitate compatible communication and cooperation with the patient care application 110, and to facilitate receipt and processing of data from sources external to the device 200 (e.g., databases or remote devices and systems).
  • The display element 214 is suitably configured to enable the device 200 to render and display various screens, recommendation messages, alerts, alarms, notifications, GUIs, GUI control elements, drop down menus, auto-fill fields, text entry fields, message fields, or the like. Of course, the display element 214 may also be utilized for the display of other information during the operation of the device 200, as is well understood. Notably, the specific configuration, operating characteristics, size, resolution, and functionality of the display element 214 can vary depending upon the implementation of the device 200.
  • As mentioned above, the medication delivery system 102 is suitably configured and programmed to support an automatic mode to automatically control delivery of insulin to the user. In this regard, FIG. 7 is a simplified block diagram representation of a closed loop glucose control system 300 arranged in accordance with certain embodiments. The system 300 depicted in FIG. 7 functions to regulate the rate of fluid infusion into a body of a user based on feedback from an analyte concentration measurement taken from the body, along with other information (e.g., history of insulin delivered, exercise or activity indications). In particular embodiments, the system 300 is implemented as an automated control system for regulating the rate of insulin infusion into the body of a user based on a glucose concentration measurement taken from the body. The system 300 is designed to model the physiological response of the user to control an insulin infusion device 302 in an appropriate manner to release insulin 304 into the body 306 of the user in a concentration profile similar to that which would be created by fully functioning human β-cells responding to changes in blood glucose concentrations in the body. Thus, the system 300 simulates the body's natural insulin response to blood glucose levels and not only makes efficient use of insulin but also accounts for other bodily functions, since insulin has both metabolic and mitogenic effects.
  • Certain embodiments of the system 300 include, without limitation: the insulin infusion device 302; a glucose sensor system 308 (e.g., the analyte sensor 112 shown in FIG. 1); and at least one controller 310, which may be incorporated in the insulin infusion device 302 as shown in FIG. 7. The glucose sensor system 308 generates a sensor signal 314 representative of blood glucose levels 316 in the body 306, and provides the sensor signal 314 to the at least one controller 310. The at least one controller 310 receives the sensor signal 314 and generates commands 320 that regulate the timing and dosage of insulin 304 delivered by the insulin infusion device 302. The commands 320 are generated in response to various factors, variables, settings, and control algorithms utilized by the insulin infusion device 302. For example, the commands 320 (and, therefore, the delivery of insulin 304) can be influenced by a target glucose setpoint value 322 that is maintained and regulated by the insulin infusion device 302. Moreover, the commands 320 (and, therefore, the delivery of insulin 304) can be influenced by stress-related information 324 obtained from one or more sources, e.g., the gesture-based physical behavior detection system 104 and/or one or more ancillary systems 106 (see FIG. 1).
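To make the feedback relationship among the sensor signal 314, the target glucose setpoint value 322, and the stress-related information 324 concrete, the following sketch shows one possible command computation. The patent does not disclose a specific control law; the proportional correction, the stress-based setpoint offset, and every name below (e.g., `compute_insulin_command`, `gain`) are illustrative assumptions only.

```python
# Illustrative sketch only: the patent does not specify the control law.
# All function and parameter names here are hypothetical.

def compute_insulin_command(sensor_glucose_mgdl: float,
                            target_setpoint_mgdl: float,
                            stress_level: float,
                            basal_rate_u_per_hr: float,
                            gain: float = 0.01,
                            stress_setpoint_offset: float = 10.0) -> float:
    """Return an insulin delivery rate (U/hr) derived from the sensor
    signal 314, the setpoint 322, and stress-related information 324."""
    # A stressed user may run higher glucose; one conceivable compensation
    # is to shift the effective setpoint while stress is detected.
    effective_setpoint = target_setpoint_mgdl + stress_level * stress_setpoint_offset
    # Simple proportional correction around the basal rate.
    error = sensor_glucose_mgdl - effective_setpoint
    rate = basal_rate_u_per_hr + gain * error
    return max(rate, 0.0)  # a delivery rate cannot be negative


# Example: elevated glucose (160 mg/dL) with moderate stress (0.0-1.0 scale).
print(round(compute_insulin_command(160.0, 120.0, 0.5, 1.0), 3))
```

A real controller would be far more sophisticated (e.g., model-based, with insulin-on-board tracking); this sketch only illustrates how stress-related information 324 can bias the commands 320.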
  • Generally, the glucose sensor system 308 includes a continuous glucose sensor, sensor electrical components to provide power to the sensor and generate the sensor signal 314, a sensor communication system to carry the sensor signal 314 to the at least one controller 310, and a sensor system housing for the electrical components and the sensor communication system. As mentioned above with reference to FIG. 6, the glucose sensor system 308 may be implemented as a computer-based or processor-based component having the described configuration and features.
  • Typically, the at least one controller 310 includes controller electrical components and software to generate commands for the insulin infusion device 302 based on the sensor signal 314, the target glucose setpoint value 322, the stress-related information 324, and other user-specific parameters, settings, and factors. The at least one controller 310 may include a controller communication system to receive the sensor signal 314 and issue the commands 320.
  • Generally, the insulin infusion device 302 includes a fluid pump mechanism 328, a fluid reservoir 330 for the medication (e.g., insulin), and an infusion tube to infuse the insulin 304 into the body 306. In certain embodiments, the insulin infusion device 302 includes an infusion communication system to handle the commands 320 from the at least one controller 310, electrical components and programmed logic to activate the motor of the fluid pump mechanism 328 according to the commands 320, and a housing to hold the components of the insulin infusion device 302. Accordingly, the fluid pump mechanism 328 receives the commands 320 and delivers the insulin 304 from the fluid reservoir 330 to the body 306 in accordance with the commands 320. It should be appreciated that an embodiment of the insulin infusion device 302 can include additional elements, components, and features that may provide conventional functionality that need not be described herein. Moreover, an embodiment of the insulin infusion device 302 can include alternative elements, components, and features if so desired, as long as the intended and described functionality remains in place. In this regard, as mentioned above with reference to FIG. 6, the insulin infusion device 302 may be implemented as a computer-based or processor-based component having the described configuration and features, including the display element 214 or other device-specific items 206 as described above.
  • The at least one controller 310 is configured and programmed to regulate the operation of the fluid pump mechanism 328 and other functions of the insulin infusion device 302. The at least one controller 310 controls the fluid pump mechanism 328 to deliver the fluid medication (e.g., insulin) from the fluid reservoir 330 to the body 306. As mentioned above, the at least one controller 310 can be housed in the infusion device housing, wherein the infusion communication system is an electrical trace or a wire that carries the commands 320 from the at least one controller 310 to the fluid pump mechanism 328. In alternative embodiments, the at least one controller 310 can be housed in the sensor system housing, wherein the sensor communication system is an electrical trace or a wire that carries the sensor signal 314 from the sensor electrical components to the at least one controller 310. In accordance with some embodiments, the at least one controller 310 has its own housing or is included in a supplemental or ancillary device. In other embodiments, the at least one controller 310, the insulin infusion device 302, and the glucose sensor system 308 are all located within one common housing.
  • Referring again to FIG. 1, the gesture-based physical behavior detection system 104 employs at least one sensor to obtain corresponding user-specific sensor data. The obtained user-specific sensor data is processed or analyzed by the gesture-based physical behavior detection system 104 and/or by another suitably configured device or component of the system 100 to determine whether the user is under stress (and, if so, how stressed). The obtained user-specific sensor data may also be processed or analyzed to obtain certain stress-related parameters, characteristics, and/or metadata for the user. For example, the obtained user-specific sensor data may identify, include, or indicate any or all of the following, without limitation: timestamp data corresponding to periods of stress; a type, category, or classification of the physical behavior or activity that is contemporaneous with periods of stress; location data; user posture or position information; etc.
  • The gesture-based physical behavior detection system 104 may include, cooperate with, or be realized as a motion-based physical behavior detection system, an activity-based physical behavior detection system, an image or video based activity detection system, or the like. In certain embodiments, the system 104 may be realized as a unitary “self-contained” wearable system that communicates with one or more other components of the system 100. For example, the system 104 can be implemented with at least one wearable device such as an activity monitor device, a smart watch device, a smart bracelet device, or the like. In some embodiments, the system 104 may be realized as at least one portable or wearable device that includes or communicates with one or more external or ancillary sensor devices, units, or components. For example, the system 104 can be implemented with a wearable or portable smart device that is linked with one or more external sensors worn or carried by the user. These and other possible deployments of the system 104 are contemplated by this disclosure. In this regard, United States patent publication number US 2020/0135320 and United States patent publication number US 2020/0289373 disclose gesture-based physical behavior detection systems that are suitable for use as the system 104; the entire content of these United States patent documents is incorporated by reference herein.
  • FIG. 8 is a block diagram representation of a gesture-based physical behavior detection system 400 arranged in accordance with certain embodiments. The system 400 is suitable for use with the system 100 shown in FIG. 1. In certain embodiments, the system 400 is deployed as a wearable electronic device in the form factor of a bracelet or wristband that is worn around the wrist or arm of a user's dominant hand. The system 400 may optionally be implemented using a modular design, wherein individual modules include one or more subsets of the disclosed components and overall functionality. The user may choose to add specific modules based on personal preferences and requirements.
  • The system 400 includes a battery 402 and a power management unit (PMU) 404 to deliver power at the proper supply voltage levels to all electronic circuits and components. The PMU 404 may also include battery-recharging circuitry. The PMU 404 may also include hardware, such as switches, that allows power to specific electronics circuits and components to be cut off when not in use.
  • When there is no movement-based or gesture-based behavior event in progress, most circuitry and components in the system 400 are switched off to conserve power. Only circuitry and components that are required to detect or help predict the start of a behavior event of interest may remain enabled. For example, if no motion is being detected, all sensor circuits but an accelerometer 406 may be switched off and the accelerometer 406 may be put in a low-power wake-on-motion mode or in another lower power mode that consumes less power and uses less processing resources than its high performance active mode. A controller 408 of the system 400 may also be placed into a low-power mode to conserve power. When motion or a certain motion pattern is detected, the accelerometer 406 and/or the controller 408 may switch into a higher power mode and additional sensors such as, for example, a gyroscope 410 and/or a proximity sensor 412 may also be enabled. When a potential start of a movement-based or gesture-based event is detected, memory variables for storing event-specific parameters, such as gesture types, gesture duration, etc. can be initialized.
  • In another example, upon detection of user motion, the accelerometer 406 switches into a higher power mode, but other sensors remain switched off until the data from the accelerometer 406 indicates that the start of a behavior event has likely occurred. At that point in time, additional sensors such as the gyroscope 410 and the proximity sensor 412 may be enabled.
  • In another example, when there is no behavior event in progress, both the accelerometer 406 and gyroscope 410 are enabled, but at least one of the accelerometer 406 or the gyroscope 410 is placed in a lower power mode compared to its regular power mode. For example, the sampling rate may be reduced to conserve power. Similarly, the circuitry required to transfer data from the system 400 to a destination device may be placed in a lower power mode; for example, the radio circuitry 414 could be disabled until a possible or likely start of a behavior event has been determined. Alternatively, the radio circuitry 414 may remain enabled but in a low power state to maintain the connection between the system 400 and one or more other components of the system 100, but without transferring user status data, sensor data, or the like.
  • In yet another example, all motion-detection related circuitry may be switched off if, based on certain metadata, it is determined that the occurrence of a particular behavior event, such as a food intake event, is unlikely. This may be desirable to further conserve power. Metadata used to make this determination may, among other things, include one or more of the following: time of day; location; ambient light levels; proximity sensing; detection that the system 400 has been removed from the wrist or hand; detection that the system 400 is being charged; or the like. Metadata may be generated and collected by the system 400. Alternatively, metadata may be collected by another device that is external to the system 400 and is configured to directly or indirectly exchange information with the system 400. It is also possible that some metadata is generated and collected by the system 400, while other metadata is generated and collected by a device that is external to the system 400. In case some or all of the metadata is generated and collected external to the system 400, the system 400 may periodically or from time to time power up its radio circuitry 414 to retrieve metadata related information from another device.
  • In certain embodiments, some or all of the sensors may be turned on or placed in a higher power mode if certain metadata indicates that the occurrence of a particular behavior event, such as the user beginning to work, jog, or eat, is likely. Metadata used to make this determination may, among other things, include one or more of the following: time of the day; location; ambient light levels; proximity sensing; historical user behavior patterns. Some or all of the metadata may be collected by the system 400 or by an ancillary device that cooperates or communicates with the system 400, as mentioned above.
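The tiered power strategy described in the preceding paragraphs can be summarized as a small state machine. The sensor groupings and transition triggers below mirror the examples above, but this is an illustrative sketch under assumed state names, not the device's actual power-management logic.

```python
# Hypothetical sketch of the tiered sensor power strategy described above.
# Sensor names mirror FIG. 8; the transition logic is illustrative only.

class SensorPowerManager:
    def __init__(self):
        # Only the accelerometer 406 stays on, in a wake-on-motion mode.
        self.enabled = {"accelerometer"}
        self.accel_mode = "low_power_wake_on_motion"

    def on_motion_detected(self):
        # The accelerometer switches to its high-performance active mode.
        self.accel_mode = "active"

    def on_likely_event_start(self):
        # Gyroscope 410 and proximity sensor 412 power up once accelerometer
        # data suggests a behavior event has likely begun.
        self.enabled |= {"gyroscope", "proximity"}

    def on_event_confirmed(self):
        # Radio circuitry 414 is enabled only when there is data to send.
        self.enabled.add("radio")

    def on_event_ended(self):
        # Fall back to the minimal-power configuration.
        self.enabled = {"accelerometer"}
        self.accel_mode = "low_power_wake_on_motion"


pm = SensorPowerManager()
pm.on_motion_detected()
pm.on_likely_event_start()
print(sorted(pm.enabled))  # the radio stays off until an event is confirmed
```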
  • User status data used to track certain aspects of a user's behavior may be stored locally inside memory 416 of the system 400 and processed locally using the controller 408 of the system 400. User status data may also be transferred to the medication delivery system 102, the patient care application 110, and/or the database 114 mentioned above with reference to FIG. 1 (such that the user status data can be processed, analyzed, or otherwise utilized by the applications or components that receive the user status data). It is also possible that some of the processing and analysis are performed locally by the system 400, while further processing and analysis are performed by one or more other components of the system 100.
  • The detection of the start of a behavior event, such as the start of a work activity, may trigger the power up and/or activation of additional sensors and circuitry, such as a camera 418. Power up and/or activation of additional sensors and circuitry may occur at the same time as the detection of the behavior event of interest or some time thereafter. Specific sensors and circuitry may be turned on only at specific times during a detected event, and may be switched off otherwise to conserve power. It is also possible that the camera 418 only gets powered up or activated upon explicit user intervention such as, for example, pushing and holding a button 420. Releasing the button 420 may turn off the camera 418 to conserve power.
  • When the camera 418 is powered up, a projecting light source 422 may also be enabled to provide visual feedback to the user about the area that is within view of the camera or to otherwise illuminate the field of view. Alternatively, the projecting light source 422 may only be activated sometime after the camera 418 has been activated. In certain cases, additional conditions may need to be met before the projecting light source 422 is activated. Such conditions may include: the determination that the projecting light source 422 is likely aiming in the direction of the object of interest; the determination that the system 400 is not moving excessively; or the like. In some embodiments, one or more light emitting diodes (LEDs) 426 may be used as the projecting light source 422.
  • Images may be tagged with additional information or metadata such as: camera focal information; proximity information from the proximity sensor 412; ambient light levels information from an ambient light sensor 424; timestamp information; etc. Such additional information or metadata may be used during the processing and analysis of the user status data.
  • The projecting light source 422 may also be used to communicate other information. As an example, an ancillary device may use inputs from one or more proximity sensors 412, process those inputs to determine if the camera 418 is within the proper distance range from the object of interest, and use one or more light sources to communicate that the camera is within the proper distance range, that the user needs to increase the distance between the camera and the object of interest, or that the user needs to reduce the distance between the camera and the object of interest.
  • The projecting light source 422 may also be used in combination with the ambient light sensor 424 to communicate to the user if the ambient light is insufficient or too strong for an adequate quality image capture. The projecting light source 422 may also be used to communicate information including, but not limited to, a low battery situation or a functional defect.
  • The projecting light source 422 may also be used to communicate dietary coaching information. As an example, the projecting light source 422 might, among other things, indicate if not enough or too much time has expired since a previous food intake event, or may communicate to the user how he/she is doing against specific dietary goals.
  • Signaling mechanisms to convey specific messages using one or more projecting light sources 422 may include, but are not limited to, one or more of the following: specific light intensities or light intensity patterns; specific light colors or light color patterns; specific spatial or temporal light patterns. Multiple mechanisms may also be combined to signal one specific message.
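As a purely hypothetical illustration of such signaling mechanisms, the sketch below assigns colors and blink counts to a few of the messages mentioned above. The patent lists the mechanisms (intensities, colors, spatial or temporal patterns) but does not define any specific pattern assignments, so every mapping here is invented.

```python
# Hypothetical signaling table: the message names and color/blink
# assignments are invented examples, not defined by the patent.
SIGNAL_PATTERNS = {
    "in_range":        {"color": "green", "blinks": 1},
    "move_closer":     {"color": "amber", "blinks": 2},
    "move_farther":    {"color": "amber", "blinks": 3},
    "low_battery":     {"color": "red",   "blinks": 5},
    "ambient_too_low": {"color": "white", "blinks": 2},
}

def encode_message(message: str):
    """Combine a light color with a blink count to signal one message."""
    p = SIGNAL_PATTERNS[message]
    return [p["color"]] * p["blinks"]  # one entry per blink

print(encode_message("move_closer"))
```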
  • A microphone 428 may be used by the user to add specific or custom labels or messages to a detected event and/or image. In certain embodiments, audio captured by the microphone 428 can be processed to assist in the determination of whether the user is eating or drinking. Audio snippets may be processed by a voice recognition engine.
  • In certain embodiments, the accelerometer 406 (possibly combined with other sensors, including other inertial sensors) may, in addition to tracking at least one parameter that is directly related to a gesture-based behavior event, also be used to track one or more parameters that are not directly related to that particular event. Such parameters may, among other things, include physical activity, sleep, stress, or illness.
  • In addition to the particular sensors, detectors, and components mentioned above, the system 400 may include or cooperate with any number of other sensors 430 as appropriate for the particular embodiment. For example, and without limitation, the system 400 may include or cooperate with any or all of the following: a heartrate monitor; a physiological characteristic or analyte sensor; a continuous glucose monitor; a GPS receiver; and any other sensor, monitor, or detector mentioned elsewhere herein. The system 400 obtains user status data from one or more of its sensors, detectors, and sources, wherein the user status data indicates a stressful activity of the user. The user status data can be analyzed and processed by the system 400 (and/or by one or more other components of the system 100) to determine whether the user is under stress and, in certain embodiments, to determine additional information, characteristics, or metrics related to the user's stress level, stressful conditions, etc. In certain embodiments, the system 400 and/or an ancillary system 106 or device determines the user's stress activity status primarily based on the output of user-worn motion sensors, movement sensors, one or more inertial sensors (e.g., one or more accelerometers and/or one or more gyroscopes), one or more GPS sensors, one or more magnetometers, one or more force or physical pressure sensors, or the like, which are suitably configured, positioned, and arranged to measure physical movement or motion of the user's limbs, digits, joints, facial features, head, and/or other body parts.
  • In some embodiments, the system 400 includes at least one haptic interface 440 that is suitably configured and operated to provide haptic feedback as an output. The at least one haptic interface 440 generates output(s) that can be experienced by the sense of touch by the user, e.g., mechanical force, vibration, movement, temperature changes, or the like. Haptic feedback generated by the at least one haptic interface 440 may represent or be associated with one or more of the following, without limitation: reminders; alerts; confirmations; notifications; messages; numerical values (such as measurements); status indicators; or any other type of output provided by the system 400.
  • In certain embodiments, the user status data (e.g., sensor data) is provided to a gesture recognizer unit or processor. To this end, sensor data may be sent in raw format. Alternatively, a source of sensor data may perform some processing (e.g., filtering, compression, or formatting) on raw sensor data before sending the processed sensor data to the gesture recognizer unit. The gesture recognizer unit analyzes the incoming sensor data and converts the incoming sensor data into a stream of corresponding gestures, which may be predetermined or otherwise classified or categorized. The gesture recognizer unit may use one or more ancillary inputs (such as the output from one or more ancillary systems 106) to aid in the gesture determination process. Nonlimiting examples of an ancillary input include: time of day; the probability of a specific gesture occurring based on statistical analysis of historical gesture data for that user; geographical location; heart rate; and/or other physiological sensor inputs. Other ancillary inputs are also possible.
  • The output of the gesture recognizer unit—the detected gestures—can be sent to an event detector or processor. The event detector analyzes the incoming stream of gestures to determine if the start of an event of interest (e.g., a period of stress or stressful activity) has occurred, whether an event is ongoing, whether an event has ended, or the like. For example, if the event detector is implemented as a stress detector intended to capture stressful events, it will be suitably configured to determine if the start of a stressful period has occurred, if the user is still under stress, or if the user has stopped the stressful activity. Although this description focuses on stress detection, the gesture-based physical behavior detection system 400 may be suitably configured to monitor other types of physical behavior or activities. Such activities include, without limitation: eating; reading; sleeping; smoking; getting dressed; turning down a bed; making a bed; brushing teeth; combing hair; talking on the phone; inhaling or injecting a medication; and activities related to hand hygiene or personal hygiene.
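One way to picture the recognizer-to-detector pipeline is a sliding-window heuristic over the incoming gesture stream: the event detector flags the start of a stressful period when stress-associated gestures dominate recent history, and flags its end when they subside. The gesture labels, window size, and start/end ratios below are invented for illustration; the patent does not prescribe any particular detection rule.

```python
# Illustrative pipeline sketch: gesture labels, thresholds, and the
# windowing rule are assumptions, not taken from the patent text.
from collections import deque

STRESS_GESTURES = {"fidget", "face_touch", "hair_pull"}  # hypothetical labels

class StressEventDetector:
    """Event detector: flags a stress event when stress-associated gestures
    dominate a sliding window of the gesture recognizer's output stream."""

    def __init__(self, window=10, start_ratio=0.6, end_ratio=0.2):
        self.window = deque(maxlen=window)
        self.start_ratio, self.end_ratio = start_ratio, end_ratio
        self.in_event = False

    def update(self, gesture: str) -> bool:
        self.window.append(gesture in STRESS_GESTURES)
        ratio = sum(self.window) / len(self.window)
        if not self.in_event and ratio >= self.start_ratio:
            self.in_event = True   # start of a stressful period detected
        elif self.in_event and ratio <= self.end_ratio:
            self.in_event = False  # the stressful activity has stopped
        return self.in_event


det = StressEventDetector(window=5)
stream = ["idle", "fidget", "fidget", "idle", "idle", "idle", "idle", "idle"]
print([det.update(g) for g in stream])
```

The same structure could host any of the other monitored behaviors listed above (eating, smoking, hand hygiene, and so on) by swapping the gesture set and thresholds.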
  • Referring again to FIG. 1, certain functions, features, and/or therapy related operations of the medication delivery device 102 can be adjusted or modified in response to the output of the gesture-based physical behavior detection system 104 and/or the output of at least one ancillary system 106. More specifically, operation of the medication delivery device 102 can be controlled or regulated based on a determination that the user is experiencing stress. For example, if the system 100 determines that the user is under stress, then the therapy control algorithm of the medication delivery device 102 can be adjusted or replaced with a stress-correlated therapy control algorithm that compensates for the detected stress.
  • FIG. 9 is a flow chart that illustrates an automated method of operating a medication delivery system. The example described here represents a control process 500 for a device that delivers insulin to a user, such as the insulin infusion device 302 shown in FIG. 7. This description assumes that the insulin infusion device is operating in a first mode of operation to automatically deliver the insulin medication to the user in accordance with a therapy control algorithm (task 502). As explained above with reference to FIG. 7, insulin delivery is controlled and regulated in accordance with various factors, sensor data, user-specific settings, a target glucose setpoint value, etc. Moreover, the baseline therapy control algorithm may include, utilize, or be defined by certain parameters, constants, thresholds, variables, limits, or the like, some of which may be user-specific, and some of which may be adjustable or dynamic in nature to alter the aggressiveness of insulin therapy. In this regard, a nominal or default therapy control algorithm can be used when the user is relatively stress free, under little to no stress, or within a threshold amount from an average stress level for that user. In contrast, the default therapy control algorithm can be adjusted or modified, or a different stress-correlated therapy control algorithm can be utilized, when the user is experiencing stress.
  • The process 500 receives stress-identifying data that indicates a current stress status of the user (task 504) and analyzes or processes at least some of the received stress-identifying data (e.g., the stress-related information 324 of FIG. 7) to determine whether the user is under stress while the insulin infusion device is operating in the first mode (query task 506). As mentioned above, the stress-identifying data is generated by sensors, detector units, or other sources of data that are included with or associated with a suitably configured gesture-based physical behavior detection system 400 (e.g., the accelerometer 406, the gyroscope 410, the proximity sensor 412, one or more other sensors 430, the microphone 428, and/or the camera 418). Accordingly, the stress-identifying data may be generated at least in part from gesture data obtained for the user. Depending on the particular embodiment, at least some of the stress-identifying data may include user status data generated or provided by at least one ancillary system 106 or device (other than the gesture-based physical behavior detection system 400) that monitors certain characteristics, status, or condition of the user. Accordingly, the stress-identifying data may be generated at least in part from such user status data.
  • The process 500 continues by analyzing or processing at least some of the received stress-identifying data to determine whether the user is under stress (query task 506). If analysis of the stress-identifying data indicates that the user is not under stress or that the user's stress level is below a certain threshold (the “No” branch of query task 506), then the process 500 continues to operate the insulin infusion device in the first mode, using the same therapy control algorithm. If, however, the stress-identifying data indicates that the user is under stress while the insulin infusion device is operating in the first mode (the “Yes” branch of query task 506), then operation of the insulin infusion device is adjusted or changed in a stress-correlated manner to compensate for the detected level of stress, the type of stress, the duration of detected stress, etc. In certain embodiments, the process 500 changes at least one therapy-altering factor of the currently active therapy control algorithm to obtain an appropriate stress-correlated therapy control algorithm that compensates for the detected stress condition (task 508). In accordance with some embodiments, the process 500 accesses, retrieves, or selects an appropriate stress-correlated therapy control algorithm that compensates for the detected stress condition (task 510). Regardless of which technique is utilized, changing the existing therapy control algorithm or selecting a new therapy control algorithm may be a function of a stress level of the user, as determined from the stress-identifying data.
  • The process 500 continues by operating the insulin infusion device in a second mode of operation (e.g., under the changed therapy control algorithm or the new therapy control algorithm) to automatically deliver the insulin medication to the user in accordance with the stress-correlated therapy control algorithm (task 512). As mentioned above, the second mode of operation compensates for user stress as determined from the stress-identifying data. This example assumes that the transition from the first mode of operation to the second mode of operation occurs automatically, and without any user input or involvement. In some embodiments, however, the process 500 may require a user confirmation before transitioning to the second mode of operation. In some implementations, a remote data processing system (e.g., a cloud-based system such as the data processing system 116 shown in FIG. 1) receives and processes the stress-identifying data to determine whether the user is under stress and, if so, sends at least one command, instruction, or control signal to the medication delivery system 102. The at least one command, instruction, or control signal causes the medication delivery system 102 to transition from the first mode of operation to the second mode of operation.
  • FIG. 9 shows the process 500 leading back to task 504 to receive updated or additional stress-identifying data for continued monitoring and processing during the second mode of operation. If the process 500 determines (from the updated stress-identifying data) that the user is no longer experiencing stress (the “No” branch of query task 506), then the baseline therapy control algorithm or any suitable algorithm can be utilized to transition away from the second mode of operation and return to the first mode of operation (task 502). Thereafter, the process 500 may continue as described above.
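The mode-switching loop described above can be sketched as follows. This is a minimal illustration only; the parameter names and values are hypothetical, not clinical settings, and the sketch omits the sensor processing and safety logic an actual insulin infusion device would require.

```python
from dataclasses import dataclass


@dataclass
class TherapyParams:
    """Parameters that define one mode of the therapy control algorithm."""
    setpoint_mg_dl: float  # target glucose setpoint
    gain_scale: float      # multiplier applied to controller gains


# First-mode (baseline) and second-mode (stress-correlated) parameter sets.
# The specific numbers here are illustrative placeholders.
BASELINE = TherapyParams(setpoint_mg_dl=120.0, gain_scale=1.0)
STRESS_CORRELATED = TherapyParams(setpoint_mg_dl=110.0, gain_scale=1.2)


def select_mode(under_stress: bool) -> TherapyParams:
    """Return the active therapy parameters for the current control cycle.

    Mirrors tasks 502-512 of process 500: the baseline (first-mode)
    algorithm is used unless stress is detected, in which case the
    stress-correlated (second-mode) parameters take effect; when the
    updated data indicates stress has subsided, the baseline returns.
    """
    return STRESS_CORRELATED if under_stress else BASELINE
```

Each pass through the loop re-evaluates the stress determination, so the transition back to the first mode at task 502 falls out of the same selection step.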
  • As mentioned above, the process 500 (at query task 506) determines whether the user is under stress, based on the received stress-identifying data. The stress-identifying data may include, for example, any of the following: raw (uncharacterized or unprocessed) or processed sensor data generated by the gesture-based physical behavior detection system 104, 400; gesture data provided by the system 104, 400; raw (uncharacterized or unprocessed) or processed sensor data generated by one or more ancillary systems 106; and raw (uncharacterized or unprocessed) or processed sensor data generated by the analyte sensor 112. In certain embodiments, the device or system that makes the stress determination has already been trained with historical data such that it can compare the received stress-identifying data against historical data, trends, patterns, and/or conditions that are known to be correlated with user stress. For example, if the received stress-identifying data includes gesture data that indicates physical behavior or stress-indicating gestures that historically correspond to work-related activity (e.g., typing, operating heavy machinery, delivering packages, public speaking), calendar data that historically indicates typical working hours, and GPS location data that historically indicates a typical work location, then the process 500 can determine and declare that the user is likely under stress.
  • As another example, if the received stress-identifying data includes gesture data that indicates physical behavior historically corresponding to commuting activity (e.g., driving) and ancillary user status data that indicates physiological stress (e.g., high pulse rate, high blood pressure, sweating, high body temperature), then the process 500 can determine and declare that the user is likely under stress (the “Yes” branch of query task 506). As another example, if the received stress-identifying data includes gesture data that indicates physical behavior historically corresponding to studying, doing homework, taking an exam, or attending class (school-related activity) and ancillary user status data that indicates conditions normally associated with school-related activity (e.g., calendar information, location data, time of day), then the process 500 can determine and declare that the user is likely under stress (the “Yes” branch of query task 506).
  • Accordingly, the process 500 may declare that the user is currently under stress, identify times during which the user is under stress, indicate the type of stress (e.g., work-related stress, school-related stress, mental stress, physical stress, illness-related stress, or the like), quantify or classify a current stress level or severity, or the like, as appropriate for the particular embodiment and application of the system 100.
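The stress determination at query task 506 can be sketched as a rule-based check that compares current readings against per-user history, in the spirit of the examples above. All dictionary keys, thresholds, and category names here are assumptions for illustration; a trained implementation would derive them from the historical data described above.

```python
def is_under_stress(sample: dict, history: dict) -> bool:
    """Illustrative stress check combining physiological and contextual
    signals, as in the work-related and commuting examples above.

    `sample` holds current readings (gesture label, location, vitals);
    `history` holds per-user thresholds and patterns learned during
    training. All keys are hypothetical.
    """
    # Physiological branch: vitals exceeding the user's learned thresholds.
    physiological = (
        sample.get("heart_rate_bpm", 0) > history["heart_rate_threshold"]
        or sample.get("body_temp_c", 0) > history["body_temp_threshold"]
    )
    # Contextual branch: a stress-indicating gesture observed at a
    # location historically associated with user stress.
    contextual = (
        sample.get("gesture_label") in history["stress_gestures"]
        and sample.get("location") in history["stress_locations"]
    )
    return physiological or contextual
```

Either branch alone declares stress here; an actual embodiment could instead weight, score, or classify the combined signals.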
  • As mentioned above with reference to task 512, the insulin infusion device transitions to the second mode of operation to deliver insulin in accordance with the stress-correlated therapy control algorithm. In certain embodiments, the stress-correlated therapy control algorithm increases the aggressiveness of the insulin therapy provided by the insulin infusion device, relative to the baseline therapy control algorithm that is utilized for the first mode of operation. Increasing the aggressiveness is desirable to counteract any stress-induced increase in blood glucose. Thus, one or more settings, parameters, or variables can be adjusted based on stress detection. For example, the user's target glucose setpoint value can be adjusted, or controller gain values (which are utilized by the automatic insulin delivery control algorithm) can be adjusted as a function of stress detection and/or certain stress-related characteristics.
  • In certain embodiments, the controller of the insulin infusion device employs a proportional-integral-derivative insulin feedback (PID-IFB) control algorithm designed for continuous closed-loop insulin delivery control. Some implementations of the PID-IFB control algorithm include PID gain values that are applied to an error term, a time derivative of sensor glucose term, and an integral error term (which is the integral action on historical errors between sensor glucose readings and the controller setpoint, such as 100 mg/dL). Moreover, certain implementations of the PID-IFB control algorithm calculate the IFB using time constants that can be adjusted based on stress detection or observed/measured stress characteristics. In addition, certain implementations of the PID-IFB control algorithm employ a maximum insulin limit (referred to as “Umax”) that governs the insulin dosage output of the control algorithm—Umax can also be adjusted based on stress detection or observed/measured stress characteristics. In this regard, the controller gain values, Umax, and/or time constants can be regulated to make the controller more or less responsive to changes in sensor glucose measurements during periods of user stress. It should be appreciated that insulin therapy can be changed in other ways based on detected user stress, and that the examples provided here are neither exhaustive nor limiting.
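A minimal discrete PID sketch of the kind of closed-loop control described above follows. The gains, setpoint, and Umax values are illustrative only, not clinical values, and the insulin-feedback (IFB) compensation and time constants of the actual PID-IFB algorithm are omitted for brevity.

```python
class PIDController:
    """Skeletal discrete PID controller with a Umax output clamp and
    stress-based gain scaling, illustrating the adjustments described
    above. Not a clinical implementation.
    """

    def __init__(self, kp, ki, kd, setpoint=100.0, u_max=2.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint      # controller setpoint, mg/dL
        self.u_max = u_max            # maximum insulin output per cycle
        self.integral = 0.0
        self.prev_error = None

    def step(self, sensor_glucose, dt=1.0):
        """Compute one cycle's insulin output from a sensor glucose reading."""
        error = sensor_glucose - self.setpoint
        self.integral += error * dt   # integral action on historical errors
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Umax governs the dosage output; doses cannot be negative.
        return min(max(u, 0.0), self.u_max)

    def apply_stress_scaling(self, scale):
        """Make the controller more responsive during detected user stress
        by scaling gains and the Umax limit (scale > 1.0 is more aggressive).
        """
        self.kp *= scale
        self.ki *= scale
        self.u_max *= scale
```

Scaling down (scale < 1.0) correspondingly yields the less aggressive behavior discussed below.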
  • In certain implementations, aggressiveness of the insulin therapy provided by the insulin infusion device is correlated with (e.g., proportional to) a measure or level of stress that is detected by the process 500. Accordingly, the control algorithm and/or therapy delivery actions can be adjusted, controlled, or regulated in a different manner using any desired methodology that is driven by the detected stress level. For example, increased aggressiveness may be associated with enabling automatic basal insulin delivery by the insulin infusion device, temporarily increasing the basal rate of insulin, enabling an automatic bolus delivery feature, administering or recommending an additional insulin bolus, lowering the target glucose setpoint value, temporarily increasing insulin limits that govern the delivery of insulin to the user, or the like. In contrast, less aggressive insulin therapy may be associated with capped or limited insulin boluses, using an upper limit on a current sensor glucose value for purposes of calculating and administering an automatic bolus, or temporarily using a modified basal delivery profile (e.g., a reduced or flat profile) for the delivery of insulin.
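One way to make aggressiveness proportional to the detected stress level is a simple mapping from a normalized stress score to therapy adjustments. The adjustment magnitudes below are hypothetical placeholders chosen only to illustrate the proportionality.

```python
def therapy_adjustments(stress_level: float) -> dict:
    """Map a normalized stress level (0.0 to 1.0) to illustrative
    therapy adjustments, with aggressiveness proportional to the
    detected stress as described above. All factors are hypothetical.
    """
    level = min(max(stress_level, 0.0), 1.0)
    return {
        # lower the target glucose setpoint by up to 10 mg/dL
        "setpoint_offset_mg_dl": -10.0 * level,
        # temporarily raise the basal insulin rate by up to 20%
        "basal_rate_scale": 1.0 + 0.2 * level,
        # temporarily raise the insulin limit (Umax) by up to 15%
        "u_max_scale": 1.0 + 0.15 * level,
    }
```

A stress level of 0.0 leaves every factor at its neutral value, so the same mapping covers the return to baseline therapy.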
  • FIG. 10 is a flow chart that illustrates a training process 600 according to certain embodiments. As mentioned above, the system 100 can be initialized or trained with historical data for purposes of determining whether the user is under stress, based on obtained gesture data and/or ancillary user status data. Accordingly, the process 600 can be employed with certain embodiments to train the stress detection feature. It should be appreciated that other methodologies, including those that need not employ “training” per se, can be utilized in an implementation of the system 100.
  • The process 600 obtains gesture training data, which is provided by the gesture-based physical behavior detection system 104, 400 during one or more training sessions or periods of time (task 602). Alternatively or additionally, the process 600 obtains ancillary user status training data (e.g., location, date/time, heart rate, blood pressure, sweat, temperature, physical activity, exercise, or any other type of data provided by an ancillary system 106 as described above with reference to FIG. 1), which is provided by one or more ancillary systems 106 during one or more training sessions or periods of time (task 604). The process 600 also obtains stress marker data, which may be entered by the user, during the training sessions or periods of time (task 606). The stress marker data can be obtained in response to the user interacting with one or more user devices 108 to record, flag, mark, or otherwise identify points in time or periods of time at which the user is under stress. The stress marker data may also include information that characterizes or describes the type of stress, the severity or magnitude of the stress (e.g., low, average, or high stress level), or other metadata related to the recorded stress. The process 600 may continue by temporally correlating the obtained training data (e.g., gesture training data and/or ancillary user training data) with the obtained stress marker data (task 608). The temporal correlation can be utilized to identify and record certain stress-indicating gestures performed by the user while under stress and/or to identify and record user status information obtained during the marked period of user stress (task 610), along with corresponding time/date stamp data. In this regard, different combinations of training data can be used to identify or classify a stress event during training.
For example, if the user's heart rate, sweat level, and body temperature exceed certain threshold values and other data (such as calendar information, location information, or date/time information) indicate a potentially stressful condition, then the process 600 can define the particular combination of data as a stress event. Accordingly, different combinations of physiological data, gesture data, and non-physiological data, and applicable thresholds or measurement ranges, can be analyzed for purposes of characterizing and defining stress-related events for a particular user.
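The temporal correlation of tasks 606-610 can be sketched as labeling time-stamped training records by whether they fall inside a user-marked stress window. The data shapes here are assumptions for illustration.

```python
def label_training_samples(samples, stress_windows):
    """Temporally correlate time-stamped training samples (gesture or
    ancillary user status records) with user-entered stress markers.

    `samples` is a list of (timestamp, record) pairs; `stress_windows`
    is a list of (start, end) times the user marked as stressful.
    Returns each sample extended with a boolean stress label, from
    which stress-indicating gestures and conditions can be recorded.
    """
    labeled = []
    for ts, record in samples:
        under_stress = any(start <= ts <= end for start, end in stress_windows)
        labeled.append((ts, record, under_stress))
    return labeled
```

The stress-labeled records then serve as the historical data against which process 500 compares newly received stress-identifying data.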
  • The training process 600 may continue by correlating user stress levels and/or periods of stress with historical user data related to physiological condition, measured analyte levels (e.g., obtained from the analyte sensor 112), therapy outcomes, fitness or activity logs, or the like (task 612). In this regard, the system 100 can be trained in a way that links detectable periods of user stress to the operation and control of the medication delivery system 102. Accordingly, the process 600 may generate and save one or more stress-correlated therapy control algorithms, settings, device configurations, or the like (task 614). As described above, a stress-correlated therapy control algorithm can be used going forward (after the training period ends) in response to the automatic detection of user stress.
  • The various tasks performed in connection with a process described herein may be performed by software, hardware, firmware, or any combination thereof. It should be appreciated that a described process may include any number of additional or alternative tasks, the tasks shown in a flow chart representation need not be performed in the illustrated order, and that a described process may be incorporated into a more comprehensive procedure or process having additional functionality not described in detail herein. Moreover, one or more of the illustrated tasks could be omitted from an embodiment of the described process as long as the intended overall functionality remains intact.
  • While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or embodiments described herein are not intended to limit the scope, applicability, or configuration of the claimed subject matter in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the described embodiment or embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope defined by the claims, which includes known equivalents and foreseeable equivalents at the time of filing this patent application.

Claims (20)

What is claimed is:
1. A method of operating a medication delivery system comprising a fluid pump mechanism and at least one controller that regulates operation of the fluid pump mechanism to deliver medication from the medication delivery system, the method comprising:
operating the medication delivery system in a first mode of operation to automatically deliver the medication to a user in accordance with a therapy control algorithm;
receiving stress-identifying data that indicates a current stress status of the user, the stress-identifying data generated at least in part from gesture data for the user, the gesture data provided by a gesture-based physical behavior detection system;
determining, from the stress-identifying data, that the user is under stress while the medication delivery system is operating in the first mode; and
in response to the determining, operating the medication delivery system in a second mode of operation to automatically deliver the medication to the user in accordance with a stress-correlated therapy control algorithm, wherein the second mode of operation compensates for user stress as determined from the stress-identifying data.
2. The method of claim 1, wherein a transition from the first mode of operation to the second mode of operation occurs automatically without user input.
3. The method of claim 1, further comprising:
changing at least one therapy-altering factor of the therapy control algorithm, based on the determining, to obtain the stress-correlated therapy control algorithm.
4. The method of claim 3, wherein the changing is a function of a stress level of the user, as determined from the stress-identifying data.
5. The method of claim 1, wherein:
the receiving and determining steps are performed by a data processing system that communicates with the medication delivery system; and
the data processing system sends at least one command to the medication delivery system, the at least one command causing the medication delivery system to transition from the first mode of operation to the second mode of operation.
6. The method of claim 1, wherein the received stress-identifying data comprises user status data for the user, the user status data generated by at least one ancillary system that monitors the user.
7. The method of claim 1, further comprising:
obtaining gesture training data provided by the gesture-based physical behavior detection system, and stress marker data entered by the user; and
temporally correlating the obtained gesture training data with the obtained stress marker data to identify stress-indicating gestures performed by the user while under stress, wherein the stress-identifying data is generated in response to comparing the gesture data against the identified stress-indicating gestures.
8. The method of claim 1, wherein:
the medication comprises insulin; and
the stress-correlated therapy control algorithm utilized for the second mode of operation increases aggressiveness of insulin therapy provided by the medication delivery system, relative to the therapy control algorithm utilized for the first mode of operation.
9. A medication delivery system comprising:
a fluid pump mechanism;
at least one controller that regulates operation of the fluid pump mechanism to deliver insulin from the medication delivery system; and
at least one memory element associated with the at least one controller, the at least one memory element storing processor-executable instructions configurable to be executed by the at least one controller to perform a method of controlling operation of the medication delivery system, the method comprising:
operating the medication delivery system in a first mode of operation to automatically deliver the insulin to a user in accordance with a therapy control algorithm;
receiving stress-identifying data that indicates a current stress status of the user, the stress-identifying data generated at least in part from gesture data for the user, the gesture data provided by a gesture-based physical behavior detection system;
determining, from the stress-identifying data, that the user is under stress while the medication delivery system is operating in the first mode; and
in response to the determining, operating the medication delivery system in a second mode of operation to automatically deliver the insulin to the user in accordance with a stress-correlated therapy control algorithm, wherein the second mode of operation compensates for user stress as determined from the stress-identifying data.
10. The medication delivery system of claim 9, wherein a transition from the first mode of operation to the second mode of operation occurs automatically without user input.
11. The medication delivery system of claim 9, wherein the method performed by the at least one controller further comprises:
changing at least one therapy-altering factor of the therapy control algorithm, based on the determining, to obtain the stress-correlated therapy control algorithm.
12. The medication delivery system of claim 9, wherein the gesture-based physical behavior detection system is separate and distinct from the medication delivery system.
13. The medication delivery system of claim 9, wherein the medication delivery system comprises the gesture-based physical behavior detection system.
14. The medication delivery system of claim 9, wherein the received stress-identifying data comprises user status data for the user, the user status data generated by at least one ancillary system that monitors the user.
15. The medication delivery system of claim 9, wherein the method performed by the at least one controller further comprises:
obtaining gesture training data provided by the gesture-based physical behavior detection system, and stress marker data entered by the user; and
temporally correlating the obtained gesture training data with the obtained stress marker data to identify stress-indicating gestures performed by the user while under stress, wherein the stress-identifying data is generated in response to comparing the gesture data against the identified stress-indicating gestures.
16. The medication delivery system of claim 9, wherein the stress-correlated therapy control algorithm utilized for the second mode of operation increases aggressiveness of insulin therapy provided by the medication delivery system, relative to the therapy control algorithm utilized for the first mode of operation.
17. A system comprising:
an insulin infusion device that regulates delivery of insulin to a user;
a gesture-based physical behavior detection system configured to generate gesture data for the user, and configured to communicate the gesture data; and
at least one controller that controls operation of the insulin infusion device, the at least one controller configured to:
operate the insulin infusion device in a first mode of operation to automatically deliver the insulin to the user in accordance with a therapy control algorithm;
process stress-identifying data that indicates a current stress status of the user, the stress-identifying data generated at least in part from gesture data provided by the gesture-based physical behavior detection system;
determine, from the stress-identifying data, that the user is under stress while the insulin infusion device is operating in the first mode; and
in response to determining that the user is under stress, operate the insulin infusion device in a second mode of operation to automatically deliver the insulin to the user in accordance with a stress-correlated therapy control algorithm, wherein the second mode of operation compensates for user stress as determined from the stress-identifying data.
18. The system of claim 17, wherein the insulin infusion device comprises the at least one controller.
19. The system of claim 17, wherein the stress-correlated therapy control algorithm utilized for the second mode of operation increases aggressiveness of insulin therapy provided by the insulin infusion device, relative to the therapy control algorithm utilized for the first mode of operation.
20. The system of claim 17, wherein the stress-identifying data comprises user status data for the user, the user status data generated by at least one ancillary system that monitors characteristics, status, or condition of the user.
US17/118,105 2019-12-13 2020-12-10 Controlling medication delivery system operation and features based on automatically detected user stress level Abandoned US20210178063A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/118,105 US20210178063A1 (en) 2019-12-13 2020-12-10 Controlling medication delivery system operation and features based on automatically detected user stress level

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962948015P 2019-12-13 2019-12-13
US17/118,105 US20210178063A1 (en) 2019-12-13 2020-12-10 Controlling medication delivery system operation and features based on automatically detected user stress level

Publications (1)

Publication Number Publication Date
US20210178063A1 true US20210178063A1 (en) 2021-06-17

Family

ID=76316366

Family Applications (3)

Application Number Title Priority Date Filing Date
US17/118,105 Abandoned US20210178063A1 (en) 2019-12-13 2020-12-10 Controlling medication delivery system operation and features based on automatically detected user stress level
US17/119,007 Active 2041-06-24 US12369819B2 (en) 2019-12-13 2020-12-11 Multi-sensor gesture-based operation of a medication delivery system
US19/248,138 Pending US20250318751A1 (en) 2019-12-13 2025-06-24 Multi-sensor gesture-based operation of a medication delivery system

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/119,007 Active 2041-06-24 US12369819B2 (en) 2019-12-13 2020-12-11 Multi-sensor gesture-based operation of a medication delivery system
US19/248,138 Pending US20250318751A1 (en) 2019-12-13 2025-06-24 Multi-sensor gesture-based operation of a medication delivery system

Country Status (1)

Country Link
US (3) US20210178063A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220143315A1 (en) * 2020-11-10 2022-05-12 Baxter International Inc. System and method for extending the storage duration of a rechargeable battery of an infusion pump
US11488700B2 (en) * 2019-12-13 2022-11-01 Medtronic Minimed, Inc. Medical device configuration procedure guidance responsive to detected gestures
US20230013632A1 (en) * 2016-05-02 2023-01-19 Dexcom, Inc. System and method for providing alerts optimized for a user
US20230071908A1 (en) * 2020-02-10 2023-03-09 Prevayl Innovations Limited Wearable article
US20240196336A1 (en) * 2022-12-12 2024-06-13 Dexcom, Inc. Variable power transmission for battery-powered devices
US12369819B2 (en) 2019-12-13 2025-07-29 Medtronic Minimed, Inc. Multi-sensor gesture-based operation of a medication delivery system

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118885066A (en) 2019-11-08 2024-11-01 苹果公司 Machine learning based gesture recognition using multiple sensors
IT202100026348A1 (en) * 2021-10-14 2023-04-14 Univ Campus Bio Medico Di Roma DEVICE FOR SUBCUTANEOUS ADMINISTRATION
EP4426186A1 (en) * 2021-11-03 2024-09-11 Sanofi User authentication for a drug delivery device
US20250110570A1 (en) * 2023-09-29 2025-04-03 Apple Inc. Commands using secondary device gestures

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170091259A1 (en) * 2015-09-29 2017-03-30 Ascensia Diabetes Care Holdings Ag Methods and apparatus to reduce the impact of user-entered data errors in diabetes management systems
US20170220772A1 (en) * 2016-01-28 2017-08-03 Savor Labs, Inc. Method and apparatus for tracking of food intake and other behaviors and providing relevant feedback
US20180174675A1 (en) * 2016-12-21 2018-06-21 Medtronic Minimed, Inc. Infusion systems and methods for prospective closed-loop adjustments
US20180185578A1 (en) * 2014-12-04 2018-07-05 Medtronic Minimed, Inc. Methods for operating mode transitions and related infusion devices and systems
US20200104039A1 (en) * 2018-09-28 2020-04-02 Snap Inc. Neural network system for gesture, wear, activity, or carry detection on a wearable or mobile device
US20210100951A1 (en) * 2019-10-04 2021-04-08 Arnold Chase Controller based on lifestyle event detection

Family Cites Families (135)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5338157B1 (en) 1992-09-09 1999-11-02 Sims Deltec Inc Systems and methods for communicating with ambulat
US4685903A (en) 1984-01-06 1987-08-11 Pacesetter Infusion, Ltd. External infusion pump apparatus
US4562751A (en) 1984-01-06 1986-01-07 Nason Clyde K Solenoid drive apparatus for an external infusion pump
US4755173A (en) 1986-02-25 1988-07-05 Pacesetter Infusion, Ltd. Soft cannula subcutaneous injection set
US5097122A (en) 1990-04-16 1992-03-17 Pacesetter Infusion, Ltd. Medication infusion system having optical motion sensor to detect drive mechanism malfunction
US5080653A (en) 1990-04-16 1992-01-14 Pacesetter Infusion, Ltd. Infusion pump with dual position syringe locator
US5956501A (en) 1997-01-10 1999-09-21 Health Hero Network, Inc. Disease simulation system and method
US5307263A (en) 1992-11-17 1994-04-26 Raya Systems, Inc. Modular microprocessor-based health monitoring system
US5832448A (en) 1996-10-16 1998-11-03 Health Hero Network Multiple patient monitoring system for proactive health management
US5545143A (en) 1993-01-21 1996-08-13 T. S. I. Medical Device for subcutaneous medication delivery
DK25793A (en) 1993-03-09 1994-09-10 Pharma Plast Int As Infusion set for intermittent or continuous administration of a therapeutic agent
US5536249A (en) 1994-03-09 1996-07-16 Visionary Medical Products, Inc. Pen-type injector with a microprocessor and blood characteristic monitor
US5391250A (en) 1994-03-15 1995-02-21 Minimed Inc. Method of fabricating thin film sensors
US5505709A (en) 1994-09-15 1996-04-09 Minimed, Inc., A Delaware Corporation Mated infusion pump and syringe
IE72524B1 (en) 1994-11-04 1997-04-23 Elan Med Tech Analyte-controlled liquid delivery device and analyte monitor
US5665065A (en) 1995-05-26 1997-09-09 Minimed Inc. Medication infusion device with blood glucose data input
WO1997019188A1 (en) 1995-11-22 1997-05-29 Minimed, Inc. Detection of biological molecules using chemical amplification and optical sensors
US6766183B2 (en) 1995-11-22 2004-07-20 Medtronic Minimed, Inc. Long wave fluorophore sensor compounds and other fluorescent sensor compounds in polymers
US6607509B2 (en) 1997-12-31 2003-08-19 Medtronic Minimed, Inc. Insertion device for an insertion set and method of using the same
DE19717107B4 (en) 1997-04-23 2005-06-23 Disetronic Licensing Ag System of container and drive device for a piston, which is held in the container containing a drug fluid
US6186982B1 (en) 1998-05-05 2001-02-13 Elan Corporation, Plc Subcutaneous drug delivery device with improved filling system
US5954643A (en) 1997-06-09 1999-09-21 Minimid Inc. Insertion set for a transcutaneous sensor
US6558351B1 (en) 1999-06-03 2003-05-06 Medtronic Minimed, Inc. Closed loop system for controlling insulin infusion
JP4063933B2 (en) 1997-12-01 2008-03-19 オリンパス株式会社 Surgery simulation device
US7647237B2 (en) 1998-04-29 2010-01-12 Minimed, Inc. Communication station and software for interfacing with an infusion pump, analyte monitor, analyte meter, or the like
US6175752B1 (en) 1998-04-30 2001-01-16 Therasense, Inc. Analyte monitoring device and methods of use
US6736797B1 (en) 1998-06-19 2004-05-18 Unomedical A/S Subcutaneous infusion set
US6355021B1 (en) 1998-07-14 2002-03-12 Maersk Medical A/S Medical puncturing device
US6554798B1 (en) 1998-08-18 2003-04-29 Medtronic Minimed, Inc. External infusion device with remote programming, bolus estimator and/or vibration alarm capabilities
US6558320B1 (en) 2000-01-20 2003-05-06 Medtronic Minimed, Inc. Handheld personal data assistant (PDA) with a medical device and method of using the same
US6248067B1 (en) 1999-02-05 2001-06-19 Minimed Inc. Analyte sensor and holter-type monitor system and method of using the same
CA2345043C (en) 1998-10-08 2009-08-11 Minimed, Inc. Telemetered characteristic monitor system
US6817990B2 (en) 1998-10-29 2004-11-16 Medtronic Minimed, Inc. Fluid reservoir piston
US7621893B2 (en) 1998-10-29 2009-11-24 Medtronic Minimed, Inc. Methods and apparatuses for detecting occlusions in an ambulatory infusion pump
US7193521B2 (en) 1998-10-29 2007-03-20 Medtronic Minimed, Inc. Method and apparatus for detecting errors, fluid pressure, and occlusions in an ambulatory infusion pump
DE69923858T2 (en) 1998-10-29 2006-01-12 Medtronic MiniMed, Inc., Northridge COMPACT PUMP DRIVE SYSTEM
US6248093B1 (en) 1998-10-29 2001-06-19 Minimed Inc. Compact pump drive system
US7806886B2 (en) 1999-06-03 2010-10-05 Medtronic Minimed, Inc. Apparatus and method for controlling insulin infusion with state variable feedback
US6752787B1 (en) 1999-06-08 2004-06-22 Medtronic Minimed, Inc. Cost-sensitive application infusion device
US6453956B2 (en) 1999-11-05 2002-09-24 Medtronic Minimed, Inc. Needle safe transfer guard
US7003336B2 (en) 2000-02-10 2006-02-21 Medtronic Minimed, Inc. Analyte sensor and method of making the same
US6895263B2 (en) 2000-02-23 2005-05-17 Medtronic Minimed, Inc. Real time self-adjusting calibration algorithm
US7890295B2 (en) 2000-02-23 2011-02-15 Medtronic Minimed, Inc. Real time self-adjusting calibration algorithm
US20010041869A1 (en) 2000-03-23 2001-11-15 Causey James D. Control tabs for infusion devices and methods of using the same
US6485465B2 (en) 2000-03-29 2002-11-26 Medtronic Minimed, Inc. Methods, apparatuses, and uses for infusion pump fluid pressure and force detection
US7227526B2 (en) 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
US6740059B2 (en) 2000-09-08 2004-05-25 Insulet Corporation Devices, systems and methods for patient infusion
EP1381408A4 (en) 2001-02-22 2007-06-13 Insulet Corp Modular infusion device and method
EP1383560B2 (en) 2001-04-06 2023-04-26 F. Hoffmann-La Roche AG Infusion set
US20020071225A1 (en) 2001-04-19 2002-06-13 Minimed Inc. Direct current motor safety circuits for fluid delivery systems
US6544212B2 (en) 2001-07-31 2003-04-08 Roche Diagnostics Corporation Diabetes management system
US7399277B2 (en) 2001-12-27 2008-07-15 Medtronic Minimed, Inc. System for monitoring physiological characteristics
US8010174B2 (en) 2003-08-22 2011-08-30 Dexcom, Inc. Systems and methods for replacing signal artifacts in a glucose sensor data stream
US7041082B2 (en) 2002-02-28 2006-05-09 Smiths Medical Md, Inc. Syringe pump control systems and methods
US8010180B2 (en) 2002-03-06 2011-08-30 Mako Surgical Corp. Haptic guidance system and method
US6960192B1 (en) 2002-04-23 2005-11-01 Insulet Corporation Transcutaneous fluid delivery system
US7278983B2 (en) 2002-07-24 2007-10-09 Medtronic Minimed, Inc. Physiological monitoring device for controlling a medication infusion device
US20040068230A1 (en) 2002-07-24 2004-04-08 Medtronic Minimed, Inc. System for providing blood glucose measurements to an infusion device
US6932584B2 (en) 2002-12-26 2005-08-23 Medtronic Minimed, Inc. Infusion device and driving mechanism and process for same with actuator for multiple infusion uses
US7488601B2 (en) 2003-06-20 2009-02-10 Roche Diagnostics Operations, Inc. System and method for determining an abused sensor during analyte measurement
US8275437B2 (en) 2003-08-01 2012-09-25 Dexcom, Inc. Transcutaneous analyte sensor
US7699807B2 (en) 2003-11-10 2010-04-20 Smiths Medical Asd, Inc. Device and method for insertion of a cannula of an infusion device
EP2301428B1 (en) 2003-12-09 2016-11-30 Dexcom, Inc. Signal processing for continuous analyte sensor
GB0329161D0 (en) 2003-12-16 2004-01-21 Precisense As Reagent for detecting an analyte
GB0329849D0 (en) 2003-12-23 2004-01-28 Precisense As Fluorometers
US7344500B2 (en) 2004-07-27 2008-03-18 Medtronic Minimed, Inc. Sensing system with auxiliary display
US8313433B2 (en) 2004-08-06 2012-11-20 Medtronic Minimed, Inc. Medical data management system and process
US7468033B2 (en) 2004-09-08 2008-12-23 Medtronic Minimed, Inc. Blood contacting sensor
AU2006226988B2 (en) 2005-03-21 2011-12-01 Abbott Diabetes Care, Inc. Method and system for providing integrated medication infusion and analyte monitoring system
EP1877116A1 (en) 2005-04-13 2008-01-16 Novo Nordisk A/S Medical skin mountable device and system
US9943372B2 (en) 2005-04-18 2018-04-17 M.S.T. Medical Surgery Technologies Ltd. Device having a wearable interface for improving laparoscopic surgery and methods for use thereof
US8137314B2 (en) 2006-08-23 2012-03-20 Medtronic Minimed, Inc. Infusion medium delivery device and method with compressible or curved reservoir or conduit
US20080097291A1 (en) 2006-08-23 2008-04-24 Hanson Ian B Infusion pumps and methods and delivery devices and methods with same
US8277415B2 (en) 2006-08-23 2012-10-02 Medtronic Minimed, Inc. Infusion medium delivery device and method with drive device for driving plunger in reservoir
US7686787B2 (en) 2005-05-06 2010-03-30 Medtronic Minimed, Inc. Infusion device and method with disposable portion
US7713240B2 (en) 2005-09-13 2010-05-11 Medtronic Minimed, Inc. Modular external infusion device
DE602006008494D1 (en) 2005-11-08 2009-09-24 M2 Medical As INFUSION PUMP SYSTEM
US8114269B2 (en) 2005-12-30 2012-02-14 Medtronic Minimed, Inc. System and method for determining the point of hydration and proper time to apply potential to a glucose sensor
US7985330B2 (en) 2005-12-30 2011-07-26 Medtronic Minimed, Inc. Method and system for detecting age, hydration, and functional states of sensors using electrochemical impedance spectroscopy
US8114268B2 (en) 2005-12-30 2012-02-14 Medtronic Minimed, Inc. Method and system for remedying sensor malfunctions detected by electrochemical impedance spectroscopy
US20070255125A1 (en) 2006-04-28 2007-11-01 Moberg Sheldon B Monitor devices for networked fluid infusion systems
US8589824B2 (en) 2006-07-13 2013-11-19 Northrop Grumman Systems Corporation Gesture recognition interface system
US7828764B2 (en) 2006-08-23 2010-11-09 Medtronic Minimed, Inc. Systems and methods allowing for reservoir filling and infusion medium delivery
US7789857B2 (en) 2006-08-23 2010-09-07 Medtronic Minimed, Inc. Infusion medium delivery system, device and method with needle inserter and needle inserter device and method
US20080125700A1 (en) 2006-11-29 2008-05-29 Moberg Sheldon B Methods and apparatuses for detecting medical device acceleration, temperature, and humidity conditions
US7946985B2 (en) 2006-12-29 2011-05-24 Medtronic Minimed, Inc. Method and system for providing sensor redundancy
US20080269714A1 (en) 2007-04-25 2008-10-30 Medtronic Minimed, Inc. Closed loop/semi-closed loop therapy modification system
US8323250B2 (en) 2007-04-30 2012-12-04 Medtronic Minimed, Inc. Adhesive patch systems and methods
US7963954B2 (en) 2007-04-30 2011-06-21 Medtronic Minimed, Inc. Automated filling systems and methods
ES2715604T3 (en) 2007-07-20 2019-06-05 Hoffmann La Roche Manual portable infusion device
US8726194B2 (en) 2007-07-27 2014-05-13 Qualcomm Incorporated Item selection using enhanced control
US8723795B2 (en) 2008-04-24 2014-05-13 Oblong Industries, Inc. Detecting, representing, and interpreting three-space input: gestural continuum subsuming freespace, proximal, and surface-contact modes
US8207859B2 (en) 2008-04-28 2012-06-26 Medtronic Minimed, Inc. Automobile physiological monitoring system and method for using the same
EP2320990B2 (en) 2008-08-29 2023-05-31 Corindus, Inc. Catheter control system and graphical user interface
US8181849B2 (en) 2008-12-30 2012-05-22 Medtronic Minimed, Inc. Color detection system for detecting reservoir presence and content in device
FI20095299A7 (en) 2009-03-23 2010-09-24 Palodex Group Oy System for managing an image plate and its image data and method for controlling the system
US8308679B2 (en) 2009-12-30 2012-11-13 Medtronic Minimed, Inc. Alignment systems and methods
US20110137141A1 (en) * 2009-12-03 2011-06-09 At&T Intellectual Property I, L.P. Wireless Monitoring of Multiple Vital Signs
JP5728159B2 (en) 2010-02-02 2015-06-03 ソニー株式会社 Image processing apparatus, image processing method, and program
US8361031B2 (en) 2011-01-27 2013-01-29 Carefusion 303, Inc. Exchanging information between devices in a medical environment
US9207767B2 (en) 2011-06-29 2015-12-08 International Business Machines Corporation Guide mode for gesture spaces
JP6715761B2 (en) 2013-03-15 2020-07-01 ベクトン・ディキンソン・アンド・カンパニーBecton, Dickinson And Company High performance adapter for injection devices
WO2014191036A1 (en) 2013-05-29 2014-12-04 Brainlab Ag Gesture feedback for non-sterile medical displays
US9595208B2 (en) 2013-07-31 2017-03-14 The General Hospital Corporation Trauma training simulator with event-based gesture detection and instrument-motion tracking
US9220463B2 (en) * 2013-10-29 2015-12-29 General Electric Company System and method of workflow management
EP3062865B1 (en) 2013-11-01 2022-05-25 Becton, Dickinson and Company Injection device configured to mate with a mobile device
EP4079251A1 (en) 2014-03-17 2022-10-26 Intuitive Surgical Operations, Inc. Guided setup for teleoperated medical device
CA2936774C (en) * 2014-04-10 2024-02-06 Dexcom, Inc. Glycemic urgency assessment and alerts interface
US10232113B2 (en) 2014-04-24 2019-03-19 Medtronic Minimed, Inc. Infusion devices and related methods and systems for regulating insulin on board
WO2016047173A1 (en) 2014-09-24 2016-03-31 オリンパス株式会社 Medical system
US9878097B2 (en) 2015-04-29 2018-01-30 Bigfoot Biomedical, Inc. Operating an infusion pump system
US11037070B2 (en) 2015-04-29 2021-06-15 Siemens Healthcare Gmbh Diagnostic test planning using machine learning techniques
US10600015B2 (en) 2015-06-24 2020-03-24 Karl Storz Se & Co. Kg Context-aware user interface for integrated operating room
CA3015179A1 (en) 2016-03-08 2016-12-08 Antisep - Tech Ltd. Method and system for monitoring activity of an individual
KR20170104819A (en) 2016-03-08 2017-09-18 삼성전자주식회사 Electronic device for guiding gesture and gesture guiding method for the same
JP7361470B2 (en) 2016-06-20 2023-10-16 バタフライ ネットワーク,インコーポレイテッド Automatic image acquisition to assist the user in operating the ultrasound device
US11197949B2 (en) 2017-01-19 2021-12-14 Medtronic Minimed, Inc. Medication infusion components and systems
US20210327304A1 (en) 2017-01-24 2021-10-21 Tienovix, Llc System and method for augmented reality guidance for use of equipment systems
US11830614B2 (en) 2017-03-20 2023-11-28 Opticsurg, Inc. Method and system for optimizing healthcare delivery
WO2018227163A1 (en) 2017-06-09 2018-12-13 Companion Medical, Inc. Intelligent medication delivery systems and methods
US11672621B2 (en) 2017-07-27 2023-06-13 Intuitive Surgical Operations, Inc. Light displays in a medical device
US11344235B2 (en) 2017-09-13 2022-05-31 Medtronic Minimed, Inc. Methods, systems, and devices for calibration and optimization of glucose sensors and sensor output
EP4376020A3 (en) * 2018-02-09 2024-08-14 DexCom, Inc. System for decision support
WO2019204395A1 (en) 2018-04-17 2019-10-24 Marchand Stacey Leighton Augmented reality spatial guidance and procedure control system
JP7271579B2 (en) 2018-06-19 2023-05-11 ホウメディカ・オステオニクス・コーポレイション Surgical support using mixed reality support in orthopedic surgery
US11177025B2 (en) 2018-06-20 2021-11-16 International Business Machines Corporation Intelligent recommendation of useful medical actions
US10383694B1 (en) 2018-09-12 2019-08-20 Johnson & Johnson Innovation—Jjdc, Inc. Machine-learning-based visual-haptic feedback system for robotic surgical platforms
US11468998B2 (en) 2018-10-09 2022-10-11 Radect Inc. Methods and systems for software clinical guidance
US11367516B2 (en) 2018-10-31 2022-06-21 Medtronic Minimed, Inc. Automated detection of a physical behavior event and corresponding adjustment of a medication dispensing system
US20200289373A1 (en) 2018-10-31 2020-09-17 Medtronic Minimed, Inc. Automated detection of a physical behavior event and corresponding adjustment of a physiological characteristic sensor device
US11191899B2 (en) 2019-02-12 2021-12-07 Medtronic Minimed, Inc. Infusion systems and related personalized bolusing methods
CN114206207A (en) 2019-08-02 2022-03-18 雅培糖尿病护理公司 Systems, devices and methods related to medication dose guidance
US11241537B2 (en) 2019-09-20 2022-02-08 Medtronic Minimed, Inc. Contextual personalized closed-loop adjustment methods and systems
US20210178063A1 (en) 2019-12-13 2021-06-17 Medtronic Minimed, Inc. Controlling medication delivery system operation and features based on automatically detected user stress level
US11488700B2 (en) 2019-12-13 2022-11-01 Medtronic Minimed, Inc. Medical device configuration procedure guidance responsive to detected gestures

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180185578A1 (en) * 2014-12-04 2018-07-05 Medtronic Minimed, Inc. Methods for operating mode transitions and related infusion devices and systems
US20170091259A1 (en) * 2015-09-29 2017-03-30 Ascensia Diabetes Care Holdings Ag Methods and apparatus to reduce the impact of user-entered data errors in diabetes management systems
US20170220772A1 (en) * 2016-01-28 2017-08-03 Savor Labs, Inc. Method and apparatus for tracking of food intake and other behaviors and providing relevant feedback
US20180174675A1 (en) * 2016-12-21 2018-06-21 Medtronic Minimed, Inc. Infusion systems and methods for prospective closed-loop adjustments
US20200104039A1 (en) * 2018-09-28 2020-04-02 Snap Inc. Neural network system for gesture, wear, activity, or carry detection on a wearable or mobile device
US20210100951A1 (en) * 2019-10-04 2021-04-08 Arnold Chase Controller based on lifestyle event detection

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230013632A1 (en) * 2016-05-02 2023-01-19 Dexcom, Inc. System and method for providing alerts optimized for a user
US11837348B2 (en) * 2016-05-02 2023-12-05 Dexcom, Inc. System and method for providing alerts optimized for a user
US12315614B2 (en) 2016-05-02 2025-05-27 Dexcom, Inc. System and method for providing alerts optimized for a user
US11488700B2 (en) * 2019-12-13 2022-11-01 Medtronic Minimed, Inc. Medical device configuration procedure guidance responsive to detected gestures
US12362055B2 (en) 2019-12-13 2025-07-15 Medtronic Minimed, Inc. Medical device configuration procedure guidance responsive to detected gestures
US12369819B2 (en) 2019-12-13 2025-07-29 Medtronic Minimed, Inc. Multi-sensor gesture-based operation of a medication delivery system
US20230071908A1 (en) * 2020-02-10 2023-03-09 Prevayl Innovations Limited Wearable article
US12419576B2 (en) * 2020-02-10 2025-09-23 Prevayl Innovations Limited Wearable article
US20220143315A1 (en) * 2020-11-10 2022-05-12 Baxter International Inc. System and method for extending the storage duration of a rechargeable battery of an infusion pump
US12427250B2 (en) * 2020-11-10 2025-09-30 Baxter International Inc. System and method for extending the storage duration of a rechargeable battery of an infusion pump
US20240196336A1 (en) * 2022-12-12 2024-06-13 Dexcom, Inc. Variable power transmission for battery-powered devices

Also Published As

Publication number Publication date
US20210177306A1 (en) 2021-06-17
US12369819B2 (en) 2025-07-29
US20250318751A1 (en) 2025-10-16

Similar Documents

Publication Publication Date Title
US20210178063A1 (en) Controlling medication delivery system operation and features based on automatically detected user stress level
US12478732B2 (en) Alert management based on sleeping status
US12433998B2 (en) Activity mode for artificial pancreas system
US11887712B2 (en) Method and system for classifying detected events as labeled event combinations for processing at a client application
US20250001078A1 (en) Translating therapy parameters of an insulin therapy system to translated therapy parameters for use at a different insulin therapy system
US11786655B2 (en) Context-sensitive predictive operation of a medication delivery system in response to gesture-indicated activity changes
US20240245860A1 (en) Controlling medication delivery system operation and features based on automatically detected muscular movements
US20210178069A1 (en) Controlling insulin delivery to a specific user by a medical device based on detected events associated with the specific user
US11488700B2 (en) Medical device configuration procedure guidance responsive to detected gestures
US11710562B2 (en) Gesture-based control of diabetes therapy
US20250186695A1 (en) Synergistic features and functions related to operation of a medication delivery system and a meal transaction application
US12478286B2 (en) Synergistic features and functions related to operation of a medication delivery system and a physical activity detection system
CN116261756A (en) Automatic disabling of diabetes status alerts and automatic predictive mode switching of glucose levels

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

AS Assignment

Owner name: MEDTRONIC MINIMED, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PARIKH, NEHA J.;MONIRABBASI, SALMAN;SIGNING DATES FROM 20210415 TO 20210422;REEL/FRAME:056506/0682

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION