WO2025213278A1 - Wearable motion interface for controlling robotic systems - Google Patents
- Publication number
- WO2025213278A1 (PCT/CA2025/050536)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- wearable
- user
- wearable interface
- signals
- sensors
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J18/00—Arms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J18/00—Arms
- B25J18/06—Arms flexible
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/0091—Shock absorbers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/06—Safety devices
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/67—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/09—Rehabilitation or training
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/22—Ergometry; Measuring muscular strength or the force of a muscular blow
- A61B5/224—Measuring muscular strength
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/389—Electromyography [EMG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7264—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
- A61B5/7267—Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
Definitions
- FIG. 5, shown therein, is a block diagram of an AI training and optimization process 500 enabled by the combined use of real-world data and simulated environments from both the wearable garment 200 and robotic systems 302, 402, according to an embodiment.
- the optimization process facilitates the continuous refinement of embedded AI models used to interpret physiological signals and issue robotic control commands, ultimately enhancing system performance and user experience.
- real-world data is generated directly from the wearable device 502 during normal operation. This includes raw sensor signals from EMG, EKG/ECG, IMUs, capacitive touch, and other modalities.
- the data is subsequently extracted, filtered, and annotated 506 through pre-processing pipelines that ensure its usability and quality for AI model training.
- the annotated real-world data 510 is compiled into training datasets used to improve motion-intent recognition, health-state detection, and robotic coordination.
- a simulated environment is constructed using a digital twin 504 of the wearable system and its integrated sensors. This virtual model enables the generation of synthetic sensor data through controlled simulations (a synthetic data generation stream 508) that mimic physical interactions, user movements, and varying operational conditions. Generative techniques such as Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), or Diffusion Models may be used to produce the synthetic dataset 512.
- Both real-world and synthetic datasets 510, 512 may be combined 514 using Sim2Real alignment techniques, ensuring cross-domain consistency and enhancing generalization. This fusion process may involve domain adaptation methods, statistical harmonization, or latent space matching to bridge the gap between physical and virtual datasets.
- the integrated dataset 514 is then used for AI model training and refinement through Transfer Learning 516.
- Techniques such as CNN-LSTM hybrids, Transformer-based sequence modeling, and fine-tuning are employed to adapt pre-trained models to the specific signal distributions and task requirements of the wearable-robot system.
- the models undergo optimization and validation 518. This includes procedures such as hyperparameter tuning, Bayesian optimization, and performance benchmarking under variable conditions.
- the result is an adaptive AI system capable of real-time intent inference and responsive robotic actuation, continuously learning from both simulated and real-world interactions.
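- By way of illustration only, the following sketch shows one way the statistical harmonization step mentioned above could be realized: per-channel moment matching of synthetic sensor data to the real-world distribution before the two datasets 510, 512 are fused. The function names and the simple z-score alignment are assumptions for exposition, not the disclosed implementation.

```python
import numpy as np

def harmonize(synthetic: np.ndarray, real: np.ndarray) -> np.ndarray:
    """Match per-channel mean/std of synthetic data to the real distribution.

    A simple stand-in for the statistical-harmonization step; the disclosure
    also contemplates domain adaptation and latent-space matching.
    """
    syn_mu, syn_sd = synthetic.mean(axis=0), synthetic.std(axis=0) + 1e-8
    real_mu, real_sd = real.mean(axis=0), real.std(axis=0)
    return (synthetic - syn_mu) / syn_sd * real_sd + real_mu

def build_training_set(real_X, real_y, syn_X, syn_y):
    """Fuse harmonized synthetic samples with annotated real-world data."""
    syn_X = harmonize(syn_X, real_X)
    X = np.concatenate([real_X, syn_X])
    y = np.concatenate([real_y, syn_y])
    idx = np.random.permutation(len(X))   # shuffle across domains
    return X[idx], y[idx]
```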
- FIG. 6, shown therein, is a block diagram illustrating the signal pathway 600 and real-time processing architecture used to provide user feedback in response to actions detected by the wearable garment 200.
- the process 600 begins when a user initiates a motion/action or physiological response 602.
- sensors integrated into the wearable 200, such as EMG and EKG/ECG electrodes 604, inertial measurement units (IMUs) 606, temperature sensors 608, and piezoresistive pressure sensors 610, generate multimodal data that captures the user’s physiological state and biomechanical movement.
- This raw sensor data is transmitted to and interpreted by an embedded microcontroller 612 or an external cloud-based processing unit.
- the system then performs contextual analysis by aggregating, filtering, and aligning sensor streams in real time 614.
- Signal fusion and interpretation modules apply algorithms that combine data from heterogeneous sources to extract actionable insights. These techniques may include statistical fusion, time-synchronized filtering, pattern recognition, and event correlation frameworks.
- the output of the fusion process enables the system to detect and classify user states and events such as posture deviation, exertion thresholds, and significant motion gestures 616. These events may trigger feedback mechanisms embedded in the wearable, including haptic cues, smart material induced stiffness adjustments, or other context-specific alerts.
- the architecture supports both threshold-based detection and event-driven interrupt protocols, ensuring low-latency responsiveness to user behavior while maintaining robust contextual awareness.
- This processing pipeline 600 enhances the intelligence and responsiveness of the wearable system, allowing it to operate autonomously or in tandem with robotic systems to close the feedback loop between sensed user actions and real-time actuation or guidance.
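- As a concrete illustration of the threshold-based detection described above, the sketch below applies hysteresis thresholding to a normalized EMG envelope to emit exertion events; using two thresholds suppresses chatter near a single boundary. The threshold values and the envelope normalization are illustrative assumptions, not disclosed parameters.

```python
import numpy as np

def detect_exertion_events(envelope, on_thresh=0.6, off_thresh=0.4):
    """Hysteresis thresholding over a normalized EMG envelope.

    Emits (start_index, end_index) pairs for sustained exertion. An event
    opens when the envelope rises above on_thresh and closes only when it
    falls below the lower off_thresh.
    """
    events, active, start = [], False, 0
    for i, v in enumerate(np.asarray(envelope, dtype=float)):
        if not active and v >= on_thresh:
            active, start = True, i
        elif active and v <= off_thresh:
            events.append((start, i))
            active = False
    if active:  # event still open at end of window
        events.append((start, len(envelope) - 1))
    return events
```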
- FIG. 7, shown therein, is a diagram illustrating the bi-directional data flow 700 between the wearable interface/garment 100, 200 and the robotic system 302, 402, emphasizing real-time signal transmission, distributed AI processing, and closed-loop actuation.
- an event trigger 702 is persistently generated in parallel with the real-time streaming of filtered physiological and motion data 704 from the wearable's onboard sensors and processing units.
- Both the event trigger 702 and streaming data 704 are transmitted simultaneously through encrypted wireless communication channels to two destinations: the robotic system 710 and cloud server 708.
- the incoming data is processed locally by the onboard AI engine and computer 714 to enable low-latency control decisions.
- the cloud platform 712 performs additional processing and analysis using larger computational resources for extended learning, anomaly detection, and performance optimization.
- Bi-directional wireless feedback 716 is continuously delivered to the wearable user 720, which may include haptic signals, garment stiffness modulation, or visual/audio cues, depending on the robot’s state and external conditions.
- the robot’s processed outputs are routed to its internal actuation and control system 718, enabling context-aware responses such as motion commands, force adjustments, or behavioural modifications.
- the framework 700 thus provides a real-time feedback architecture that leverages simultaneous local and remote AI processing to ensure adaptive, intelligent, and responsive interaction between human operators and robotic systems.
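- The disclosure requires encrypted, authenticated transmission to both the robot and the cloud but does not fix a wire format; the following is a hypothetical sketch of how one wearable packet carrying an event trigger 702 or streamed data 704 might be serialized and integrity-tagged. All field names and the HMAC construction are assumptions for exposition.

```python
import hashlib, hmac, json, time

SHARED_KEY = b"provisioned-device-key"  # placeholder; key provisioning is not specified

def make_packet(event: str, payload: dict) -> bytes:
    """Serialize one wearable message destined for both robot and cloud."""
    body = json.dumps(
        {"ts": time.time(), "event": event, "payload": payload},
        sort_keys=True,                 # stable byte order for the MAC
    ).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return body + b"|" + tag.encode()   # frame: JSON body | hex integrity tag

# Example: one filtered-feature sample streamed at 100 Hz
pkt = make_packet("stream", {"emg_rms": 0.42, "elbow_deg": 37.5})
```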
- the disclosed wearable health tracker and motion interface may be employed across multiple domains to facilitate precise, real-time control of robotic systems.
- the system interprets electromyographic (EMG) and motion signals generated by the user to deliver actionable control instructions to collaborative robots, surgical systems, warehouse automation platforms, and food preparation robots.
- the robotic systems may provide haptic feedback, motion cues, or garment-level actuation to close the control loop.
- a collaborative robot is controlled via the user’s intent as interpreted from EMG signals and inertial motion data from the wearable.
- the cobot may be configured to assist with tasks such as maneuvering heavy components, performing alignments, or supporting the user during repetitive lifting operations.
- the robotic system transmits force alerts or corrective alignment cues back to the wearable.
- the garment actuates localized haptic modules or dynamically adjusts smart materials’ stiffness to notify or assist the user.
- the wearable 200 may be used in remote surgical procedures or telepresence operations. Subtle gestures and muscle contractions are captured by the wearable interface and transmitted to a robotic surgical platform, enabling accurate manipulation of tools in real-time. In scenarios involving tissue resistance or abrupt tool feedback, the robotic system may trigger alerts which are then conveyed through haptic actuators or smart materials’ tensioning mechanisms embedded within the wearable, thereby maintaining the surgeon’s spatial and tactile awareness during delicate interventions.
- the wearable 200 facilitates hands-free control of autonomous or semi-autonomous logistics robots.
- warehouse operators equipped with the wearable may issue directional or retrieval commands via motion intent, enabling efficient item selection and transport.
- Feedback loops may alert the operator to completion of tasks, obstructions, or improper alignment via garment-based vibrations or stiffness adjustments.
- the wearable 200 may be employed in industrial kitchens or food production environments where repetitive and precision-based tasks are automated.
- a robotic arm may be controlled by the user to perform stirring, slicing, or dispensing operations. As the robot encounters variations in viscosity, resistance, or load, this data is communicated back to the wearable and translated into tactile cues that help the user adapt movement instructions in real-time, ensuring consistent output and safe operation.
- the integration of artificial intelligence (AI) into wearable-controlled robotic systems enhances system responsiveness, adaptability, prediction, and precision.
- the methodologies described herein are configured to transform multimodal sensor data from the wearable garment, such as that captured in the systems of FIGS. 1-3, into refined robotic control signals that adapt in real time to user input.
- imitation learning enables the robotic system to replicate complex human motions based on sensor-rich data collected from the wearable interface, as illustrated in FIG. 3.
- EMG electrodes, inertial sensors, and capacitive touch sensors continuously record the user's movements during task performance. Behaviour cloning models are used to map these recorded trajectories directly to robotic motion control parameters.
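- A minimal sketch of behaviour cloning as just described, reduced to supervised regression from wearable feature vectors to demonstrated joint targets. The random arrays stand in for recorded demonstrations, and the network size is an arbitrary assumption rather than the disclosed model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Behaviour cloning as supervised regression: wearable feature vectors
# (EMG amplitudes + IMU orientation) -> demonstrated joint targets.
rng = np.random.default_rng(0)
X_demo = rng.normal(size=(500, 12))   # stand-in sensor features
y_demo = rng.normal(size=(500, 6))    # stand-in 6-DoF joint targets

policy = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
policy.fit(X_demo, y_demo)

# At run time the cloned policy maps live wearable features to motion
# control parameters for the robot:
live_features = rng.normal(size=(1, 12))
joint_command = policy.predict(live_features)
```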
- the system applies Inverse Reinforcement Learning (IRL) to uncover the user's underlying objectives or Generative Adversarial Imitation Learning (GAIL) to improve generalization in diverse operational contexts.
- the AI system leverages temporal sensor data sequences, including those shown in the signal streaming pathways of FIG. 6, to forecast imminent user actions.
- Recurrent neural networks (RNNs), long short-term memory (LSTM) networks, and transformer-based architectures are trained to anticipate limb movement, gesture transitions, or intent shifts based on previous motion patterns.
- the robotic system can prepare actuation pathways or alter tool trajectories, significantly reducing operational latency and improving flow during human-robot collaboration.
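- The following sketch shows an LSTM-based intent forecaster of the kind described above, consuming a window of fused sensor features and emitting logits for the upcoming gesture class. Layer sizes, window length, and class count are illustrative assumptions.

```python
import torch
import torch.nn as nn

class IntentForecaster(nn.Module):
    """LSTM over a window of fused sensor features, predicting the next
    gesture/intent class. Sizes are illustrative, not from the disclosure."""
    def __init__(self, n_features=16, hidden=64, n_intents=8):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_intents)

    def forward(self, x):                 # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # logits for the upcoming intent

model = IntentForecaster()
window = torch.randn(1, 50, 16)           # last 50 timesteps of features
predicted_intent = model(window).argmax(dim=-1)
```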
- reinforcement learning frameworks optimize robotic behaviours by learning from the environment and user interaction data transmitted through the wearable interface.
- Feedback loops, represented in FIG. 7 as bi-directional wireless feedback 716, serve as real-time reward signals for tuning robotic actions such as grip strength, velocity, or trajectory.
- Wearable-based cues, such as tensioning of smart materials (e.g., electroactive polymers, dielectric elastomers, or shape memory alloys) or localized haptic signals, can guide users toward optimal behaviour, while the robot adapts via continuous reinforcement updates.
- the system integrates signals from multiple sensor modalities, such as those depicted in FIGS. 1 and 2, to generate robust control outputs.
- Data from EMG electrodes, IMUs, piezoresistive layers, and temperature sensors are aggregated and synchronized using deep learning fusion models that may include convolutional neural networks (CNNs), attention-based transformers, or probabilistic graphical models like hidden Markov models (HMMs).
- semantic models are trained to infer higher-level user intentions from sensor input sequences.
- AI classifiers can interpret motion cues to identify task-level goals such as lifting, gesturing, or halting. This enables the robotic system to execute actions autonomously or in coordination with the user’s predicted intent, enhancing the efficiency of collaborative workflows.
- optimized AI models are deployed directly on embedded processors within the wearable interface or its companion controller, such as the module placements shown in FIG. 2.
- Techniques like model quantization, knowledge distillation, and parameter pruning ensure efficient inference even under resource constraints.
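- As one concrete example of the compression routes listed above, PyTorch's standard post-training dynamic quantization can shrink a recurrent intent model for embedded inference. This is a sketch under assumed model sizes, not the disclosed deployment flow; distillation and pruning would be applied analogously.

```python
import torch
import torch.nn as nn

class TinyIntentNet(nn.Module):
    """Stand-in recurrent intent model with assumed dimensions."""
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(16, 32, batch_first=True)
        self.head = nn.Linear(32, 8)
    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1])

model = TinyIntentNet()

# Post-training dynamic quantization: weights stored as int8,
# activations quantized on the fly at inference time.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear, nn.LSTM}, dtype=torch.qint8
)

# The quantized module is a drop-in replacement for inference:
logits = quantized(torch.randn(1, 50, 16))
```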
- sensor data captured from the wearable (e.g., posture deviations or EMG fatigue patterns) may be correlated with external health data sources such as electronic health records (EHRs) to support broader medical analytics.
- FIG. 6’s sensor fusion and classification blocks 614 reflect the underlying infrastructure that can support these broader medical analytics.
- a digital twin of the wearable device, as outlined in FIG. 5’s synthetic data generation stream 508, is created to simulate sensor data under variable operational conditions.
Abstract
A wearable interface system for controlling robotic systems is provided. The wearable interface is configured for bi-directional interaction between a human user (wearer) and a robotic platform. The system comprises a textile-based substrate incorporating smart materials, conductive threads, and a plurality of integrated sensors operable to acquire real-time physiological and biomechanical signals. One or more embedded microcontrollers are operable to filter and interpret incoming signals, predict user intent, and generate control instructions for one or more robotic systems. The system may further include a bi-directional feedback mechanism comprising haptic actuators and smart material-based stiffness modulation components configured to respond to incoming signals from the robotic system and deliver tactile or structural cues to the user in real-time. The wearable interface may be integrated into a variety of garments.
Description
WEARABLE MOTION INTERFACE FOR CONTROLLING ROBOTIC SYSTEMS
Technical Field
[0001] The embodiments disclosed herein relate to wearable devices for health monitoring and motion control, and more particularly to integrated systems that incorporate smart materials, conductive textiles, and multimodal sensors for controlling robotic systems. This includes exoskeletons, collaborative robots, humanoids, and other robotics applications that benefit from real-time human input and feedback.
Introduction
[0002] Wearable technology has seen rapid advancement, driven by the demand for continuous physiological monitoring (e.g., heart rate, body temperature, or electrocardiograms) and motion tracking (e.g., step counting, posture correction). Despite notable progress, most conventional wearables provide only unidirectional data acquisition — often lacking robust, low-latency control signals for external systems. Moreover, these devices typically rely on rigid components or shallow sensor integration, which can reduce user comfort and limit the accuracy of sensing, particularly during dynamic activities.
[0003] In parallel, the field of robotics has made strides in autonomy, manipulation, and machine intelligence. However, many robotic applications still require precise control inputs from human operators, especially for tasks demanding fine motor skills or force feedback. Existing haptic feedback mechanisms are often bulky, slow, or incomplete, and fail to offer a seamless, intuitive user experience.
[0004] Therefore, there is an unmet need for a flexible, comfortable, and highly integrated wearable system that can both capture real-time physiological and motion signals and deliver immediate feedback or structural assistance to the user. Such a system would enable more intuitive and powerful human-robot interactions, opening possibilities in healthcare, industrial robotics, teleoperation, and other collaborative robotic applications.
Summary
[0005] According to some embodiments, the present disclosure provides a wearable interface system configured for bi-directional interaction between a human user and a robotic platform. The system comprises a textile-based substrate incorporating smart materials, conductive threads, and a plurality of integrated sensors operable to acquire real-time physiological and biomechanical signals.
[0006] The textile substrate may include multilayer fabrics embedded with smart material micro-strands and conductive fibers, forming a conformable and breathable architecture adapted for continuous wear. Sensor nodes may be strategically integrated to capture electromyographic (EMG) signals, electrocardiographic (ECG) activity, inertial motion data, and capacitive or resistive pressure inputs. One or more embedded microcontrollers, optionally comprising edge artificial intelligence (AI) processors, are operable to filter and interpret incoming signals, predict user intent, and generate control instructions for one or more robotic systems.
[0007] The system may further include a bi-directional feedback mechanism comprising haptic actuators and smart material-based stiffness modulation components. These are configured to respond to incoming signals from the robotic system and deliver tactile or structural cues to the user in real-time. In some embodiments, the system supports modular deployment using adhesive sensor patches, repositionable modules, and energy-harvesting elements adapted for off-grid or extended operation.
[0008] By enabling secure, low-latency, and context-aware communication, the system addresses limitations of existing wearable technologies and facilitates precise human-in-the-loop control across medical, industrial, and consumer robotic applications.
[0009] Other aspects and features will become apparent, to those ordinarily skilled in the art, upon review of the following description of some exemplary embodiments.
Brief Description of the Drawings
[0010] The drawings included herewith are for illustrating various examples of articles, methods, and apparatuses of the present specification. In the drawings:
[0011] FIG. 1 is a sectional diagram of a wearable interface, according to an embodiment.
[0012] FIG. 2 is a diagram of a wearable garment interface, according to an embodiment;
[0013] FIGS. 3A-3B are diagrams of the capture of physiological and motion signals from wearable interfaces transmitted as control commands to a robotic system;
[0014] FIG. 4 is diagram of a control feedback loop between a wearable interface and a robotic system, according to an embodiment;
[0015] FIG. 5 is a flow chart of integrating real-world and synthetic physiological sensor data to train and optimize a machine learning model, according to an embodiment;
[0016] FIG. 6 is a flow chart of a wearable interface signal pathway, according to an embodiment; and
[0017] FIG. 7 is a flow chart of bi-directional data flow between a wearable interface and a robotic system, according to another embodiment.
Detailed Description
[0018] Various apparatuses or processes will be described below to provide an example of each claimed embodiment. No embodiment described below limits any claimed embodiment and any claimed embodiment may cover processes or apparatuses that differ from those described below. The claimed embodiments are not limited to apparatuses or processes having all of the features of any one apparatus or process described below or to features common to multiple or all of the apparatuses described below.
[0019] One or more systems described herein may be implemented in computer programs executing on programmable computers, each comprising at least one processor, a data storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. For example, and without limitation, the programmable computer may be a programmable logic unit, a mainframe computer, a server, a personal computer, a cloud-based program or system, a laptop, a personal digital assistant, a cellular telephone, a smartphone, or a tablet device.
[0020] Each program is preferably implemented in a high-level procedural or object-oriented programming and/or scripting language to communicate with a computer system. However, the programs can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language. Each such computer program is preferably stored on a storage medium or a device readable by a general or special purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer to perform the procedures described herein.
[0021] A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary a variety of optional components are described to illustrate the wide variety of possible embodiments of the present invention.
[0022] Further, although process steps, method steps, algorithms or the like may be described (in the disclosure and/or in the claims) in a sequential order, such processes, methods and algorithms may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order that is practical. Further, some steps may be performed simultaneously.
[0023] When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article.
[0024] The term “microcontroller” as used herein refers to computer processors (e.g., central processing units, graphics processing units), integrated circuits, and systems-on-a-chip, including associated hardware and software, being configured to execute instructions, perform calculations and/or process signals/data as the case may be.
[0025] Referring to FIG. 1, shown therein is a cross-sectional schematic illustration of a multi-layered wearable interface 100, according to an embodiment. The wearable interface is generally constructed as a bioelectronic mesh 100. The bioelectronic mesh 100 integrates a combination of advanced functional materials, embedded electronics, and sensory elements into a conformable textile platform, enabling robust physiological monitoring and real-time bidirectional communication with external robotic systems.
[0026] An innermost sensing layer 118 (closest to the wearer) includes one or more physiological sensors 102 configured to capture bio-signals such as electromyographic (EMG) signals, electrocardiographic (EKG/ECG) activity, and motion data (IMU, piezoresistive sensors). These sensors 102 may be arranged in any suitable arrangement on the sensing layer 118 depending on the target (wearer’s) anatomy or intended application. The sensors 102 are disposed on or embedded in a flexible printed circuit board (PCB) material 104 (micro-wiring pathways, flex PCBs, 3D-printed liquid ink, etc.), which provides signal routing pathways and mechanical compliance during user movement while remaining stretchable and comfortable during wear. According to some embodiments, the sensors 102 further include bioimpedance sensors and/or temperature sensors.
[0027] Adjacent to or integrated with the PCB 104 is one or more antennas 106, configured for wireless communication using Bluetooth, Wi-Fi, near-field communication (NFC), 5G or other protocols. The antenna 106 may also be adapted to receive inductive power for wireless charging of onboard electronics. The wearable interface 100 includes at least one microcontroller 120 operably coupled to the sensors 102 and antennas 106. The microcontroller 120 is configured for local signal processing and feature extraction using machine learning algorithms, generating motion-intent or health-state control signals and transmitting the control signals to a robotic system via a wireless communication interface. Additional microcontrollers 120 can be substituted in place of at least one of the physiological sensors (EMG, ECG/EKG) 102 and motion sensors (IMU, piezoresistive sensors) to provide real-time data filtering and local AI capabilities, reducing latency by interpreting signals on-board and enabling immediate control adjustments without requiring cloud-based processing.
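By way of a non-limiting illustration, the following sketch shows the kind of windowed EMG feature extraction and intent classification such a microcontroller 120 could perform on-board. The feature set (RMS, zero crossings, waveform length), the window parameters, and the classifier interface are assumptions for exposition rather than the claimed implementation.

```python
import numpy as np

def emg_features(window: np.ndarray) -> np.ndarray:
    """Classic time-domain EMG features over one analysis window.

    window: 1-D array of raw EMG samples (one channel).
    Returns [RMS, zero-crossing count, waveform length].
    """
    rms = np.sqrt(np.mean(window ** 2))
    zero_crossings = np.count_nonzero(np.diff(np.signbit(window)))
    waveform_length = np.sum(np.abs(np.diff(window)))
    return np.array([rms, zero_crossings, waveform_length])

def stream_intent(samples, classifier, window_size=200, step=50):
    """Slide a window over the EMG stream and yield intent labels.

    `classifier` is any object with a predict(features) method, e.g. a
    small pre-trained model exported to the garment's microcontroller.
    """
    buf = np.asarray(samples, dtype=float)
    for start in range(0, len(buf) - window_size + 1, step):
        feats = emg_features(buf[start:start + window_size])
        yield start, classifier.predict(feats.reshape(1, -1))[0]
```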
[0028] An outer textile substrate 108 is formed from an elastic, breathable, and stretchable fabric, serving both protective and functional roles. Interwoven within this fabric 108 are conductive threads 110, for example, silver-coated nylon, carbon nanofibres, graphene or graphene-infused fibres, or polymer-based conductive fibres, which act as electrical interconnects and sensing electrodes. Embedded in parallel to the conductive threads are micro-strands 112 of smart materials such as electroactive polymers, dielectric elastomers, and shape memory alloys. The smart material elements 112 dynamically adjust their stiffness and/or shape in response to electrical current or thermal input, allowing the wearable interface 100 to provide variable resistance, dynamic posture support, or real-time corrective feedback based on sensor 102 input or robotic response.
[0029] Positioned throughout the mesh 100 are haptic actuator wires 114, which may be connected to different actuation mechanisms to deliver localized tactile cues. The haptic elements 114 allow the interface 100 to communicate alerts, guidance, or feedback signals directly to the user/wearer. In addition, piezoelectric modules 116 are integrated within the fabric 108 for dual purposes: the piezoelectric modules are configured as biomechanical signal generators when exposed to user movement (e.g., gait cycles or joint motion), or as responsive elements that deliver haptic feedback through mechanical deformation or as pressure sensors. Together, these components form a cohesive, adaptive textile system capable of interpreting user intent, transmitting control commands, and providing tactile or structural feedback as part of a closed-loop wearable-robotic interface 100.
[0030] Referring to FIG. 2, shown therein is an exemplary wearable garment 200 constructed from the bioelectronic mesh, illustrating its integration into various form factors. In the embodiment shown in FIG. 2, the wearable 200 is a t-shirt fabricated from the bioelectronic mesh, demonstrating the modular and scalable nature of the wearable interface system 100 for broader anatomical coverage. The wearable 200 incorporates a wireless communication module 202, which may support Bluetooth, Wi-Fi, NFC, or 5G connectivity, enabling low-latency communication with robotic systems, companion applications, or cloud-based health platforms.
[0031] The wearable 200 also includes haptic actuator wires 114 embedded throughout the fabric, configured to deliver tactile feedback in response to external
signals, such as robotic status changes or guided motion cues. A dedicated region 204 serves as a placeholder for one or more embedded processing modules, such as microcontrollers, edge Al processors, and motor drivers. This modular region 204 allows for easy integration of computing and control hardware, enabling on-device signal processing, machine-learning inference, and local actuation control.
[0032] The garment further integrates physiological sensor regions 102a, 102b, having EMG and EKG/ECG pads respectively. These sensors 102a, 102b are strategically placed to monitor muscle activity, cardiac signals, and related bio-signals during movement or stationary phases when the wearable 200 is worn. Additionally, the wearable 200 is designed to accommodate interaction with external users, for example, enabling physiological measurements when a person places their hand or other body part against the fabric surface. For this purpose, the garment may include additional externally-facing contact-based sensors such as optical heart rate sensors, blood oxygen (SpO2) sensors, or skin temperature sensors that are embedded in specific areas corresponding to the stimulus measured. These sensors are activated upon skin contact and are capable of capturing metrics such as heart rate, blood oxygen saturation, or galvanic skin response. Depending on the placement and the type of sensors included, the garment can detect and interpret biometric data from individuals other than the primary wearer, thereby enabling broader use cases in remote care, triage, or physical interaction scenarios.
[0033] The bioelectronic mesh 100 may also be provided as a variety of other wearable garments, accessories and form factors to support distinct sensing and control applications. These include garments such as t-shirts, sleeves, and wraps that cover larger muscle groups, enabling broader motion detection, force estimation, and posture analysis. Patches and adhesive strips may be affixed to localized anatomical areas such as the forearm, chest, or calf to enable targeted measurements and may include energy-harvesting technologies such as thermoelectric or triboelectric generators for self-powered operation, enabling extended use in off-grid or resource-constrained scenarios. Additionally, wearable accessories such as rings, bracelets, or insoles may be employed for tracking fine motor activity (e.g., finger gestures) or foot pressure distribution, particularly useful in humanoid robotics, rehabilitation, and assistive or hospitality-related applications.
[0034] Referring to FIGS. 3A-3B, shown therein are diagrams of a system 300 showing the interconnectivity and signal flow between a wearable bioelectronic garment 200 and a robotic system 302, according to an embodiment. The system 300 demonstrates how physiological and motion signals captured by the wearable are processed locally and wirelessly transmitted as control commands/signals to the robotic platform.
[0035] The wearable garment 200 and robotic system 302 are wirelessly paired via a communication channel 304 that may include one or more of Bluetooth, Wi-Fi, NFC, or 5G technologies, as described in FIGS. 1 and 2. During use, signals from embedded physiological sensors, such as EMG electrodes, EKG/ECG pads, and inertial measurement units, are acquired in real-time. The communication channel 304 provides for the live flow of intent-based command signals. These raw signals are digitized and filtered by local microcontrollers embedded in the garment. Integrated AI or machine-learning models analyze the signals to determine muscle activation thresholds, joint angles, or posture deviations, and subsequently generate motion-intent control commands. The control commands include both continuous movement instructions and discrete gesture-based triggers.
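As a non-limiting illustration of the digitization and filtering stage, the sketch below band-passes a surface-EMG channel and removes mains interference using standard SciPy filters. The sample rate, pass band, and notch frequency are conventional values assumed for exposition; the disclosure does not specify filter parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def clean_emg(raw: np.ndarray, fs: float = 1000.0) -> np.ndarray:
    """Band-pass surface EMG to its useful band and remove mains hum.

    A 20-450 Hz pass band and a 60 Hz notch are conventional choices.
    """
    # 4th-order Butterworth band-pass, zero-phase to avoid lag distortion
    b, a = butter(4, [20.0, 450.0], btype="bandpass", fs=fs)
    x = filtfilt(b, a, raw)
    # Narrow notch at the mains frequency
    bn, an = iirnotch(w0=60.0, Q=30.0, fs=fs)
    return filtfilt(bn, an, x)
```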
[0036] These commands are wirelessly transmitted from the garment to the robotic system 302 through secure communication protocols. In FIG. 3A, the extended sleeve 310 of garment 200 reflects a real-time action performed by the user (e.g., extending their right arm), while the robot’s corresponding actuator 306 mirrors that same movement. Similarly, in FIG. 3B, the robot 302 mimics the movement when the sleeve is lowered.
[0037] The robotic system 302 may incorporate its own signal processing modules that receive, validate, and apply the control data received from the wearable 200. The robotic control architecture may include PID controllers, model predictive control, or AI-based motion planning systems, with wearable-derived inputs treated as high-priority commands. Based on these inputs, the robot 302 adjusts joint torques or actuator trajectories accordingly.
[0038] Additionally, the robotic system 302 may return feedback signals to the wearable 200. These may include force/torque sensor data, joint angle verification, or safety status indicators. Upon receiving this feedback, the wearable 200 may actuate its own embedded feedback modules, including haptic actuators (e.g., vibration motors) and smart materials (e.g., electroactive polymers, dielectric elastomers, or shape memory alloys), disposed at anatomical cue points to provide real-time cues that guide the user's motion or deliver alerts, ensuring safe and synchronized operation. For example, smart material strands may be triggered to provide posture correction or resistive feedback if the robot 302 encounters an unexpected obstacle or deviation.
[0039] This closed-loop feedback and control architecture enhances shared autonomy, enabling collaborative operation modes between human users wearing the wearable 200 and robotic systems 302. The system 300 may also support multi-user coordination scenarios, where multiple wearable-equipped users interact with a common robotic platform 302. Secure encryption protocols may be used throughout the data exchange process to maintain safety, data integrity, and operational reliability across the wearable-robot ecosystem.
[0040] Referring to FIG. 4, shown therein is a diagram of a system integration 400 from a robot-centric perspective, highlighting the data flow between the wearable interface 200 and a robotic platform 402. The system 400 demonstrates how the wearable 200 serves as an upstream command source and downstream feedback receiver, forming a continuous bi-directional control loop.
[0041] In this embodiment, the wearable 200 and robotic system 402 are wirelessly paired using embedded communication modules capable of Bluetooth, Wi-Fi, NFC, or 5G protocols. Upon detecting physiological activity such as EMG signals or posture deviation, the wearable 200 transmits control commands 404 to the robotic system 402. These commands 404 may include robotic arm motion vectors, grip force parameters, or camera positioning instructions, derived from the user's intent as inferred from the sensors in the wearable 200.
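As a hedged sketch of what a serialized control command 404 might contain, the following Python dataclass bundles a motion vector, grip force, and optional gesture trigger. Every field name and unit here is a hypothetical assumption introduced for illustration, not a disclosed message format.

```python
from dataclasses import dataclass, asdict
from typing import Optional, Tuple
import json
import time

@dataclass
class ControlCommand:
    """Illustrative wearable-to-robot command payload (hypothetical schema)."""
    timestamp: float                            # acquisition time (s, epoch)
    motion_vector: Tuple[float, float, float]   # desired end-effector velocity (m/s)
    grip_force: float                           # target grip force (N)
    gesture: Optional[str] = None               # discrete trigger, e.g. "halt"

cmd = ControlCommand(
    timestamp=time.time(),
    motion_vector=(0.10, 0.00, 0.05),
    grip_force=12.5,
)
payload = json.dumps(asdict(cmd))  # serialized for the wireless link
print(payload)
```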
[0042] Conversely, the robotic system 402 provides real-time feedback 406 to the wearable 200. This feedback 406 may include sensor-derived information such as joint
torque, load status, proximity alerts, or safety triggers. The wearable 200 interprets these signals and activates one or more embedded devices to communicate information back to the user. For example, localized vibration cues or smart-material-induced textile tensioning may guide the user's movement or posture in response to the robot's environment or operational state.
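A hedged sketch of how the wearable might route incoming feedback 406 to on-garment cues follows; the field names, thresholds, and cue identifiers are entirely hypothetical, introduced only to make the routing logic concrete.

```python
from typing import Dict

def route_feedback(feedback: Dict[str, float]) -> Dict[str, float]:
    """Map hypothetical robot feedback fields to actuator intensities (0..1)."""
    cues = {}
    if feedback.get("proximity_m", 10.0) < 0.5:
        cues["wrist_vibration"] = 1.0    # urgent proximity alert
    if feedback.get("joint_torque_nm", 0.0) > 20.0:
        cues["sleeve_tension"] = 0.7     # resistive posture cue
    return cues

# Illustrative usage with stand-in feedback values
print(route_feedback({"proximity_m": 0.3, "joint_torque_nm": 25.0}))
```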
[0043] The control architecture of the robotic system 402 may employ advanced algorithms including PID loops, model predictive control, or learning-based motion planners. Inverse kinematics modules translate wearable-derived signals into joint-level commands, ensuring the robot mirrors human intent in real time. Filtering and contextual validation steps further safeguard against errant inputs or unsafe conditions.
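Since paragraph [0043] invokes inverse kinematics without specifying a solver, the following planar two-link closed-form solution is offered as a minimal sketch only; the link lengths and target position are illustrative assumptions, and the reach check stands in for the contextual validation step.

```python
import numpy as np

def two_link_ik(x, y, l1=0.3, l2=0.25):
    """Closed-form IK for a planar two-link arm (elbow-down solution)."""
    d2 = x * x + y * y
    cos_q2 = (d2 - l1 ** 2 - l2 ** 2) / (2 * l1 * l2)
    if abs(cos_q2) > 1:
        raise ValueError("target out of reach")  # simple contextual validation
    q2 = np.arccos(cos_q2)
    q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2), l1 + l2 * np.cos(q2))
    return q1, q2

# Wearable-derived wrist position mapped to joint-level commands
q1, q2 = two_link_ik(0.35, 0.20)
print(f"shoulder: {np.degrees(q1):.1f} deg, elbow: {np.degrees(q2):.1f} deg")
```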
[0044] This architecture enables shared autonomy and hybrid control paradigms, where human input augments robotic intelligence. In team-based scenarios, multiple users equipped with such wearables 200 may interface with a robotic system 402 collaboratively, ensuring coordinated motion and balanced load distribution. Redundant safety mechanisms and encrypted communication protocols ensure robust, fail-safe operation across the human-robot interface.
[0045] Referring to FIG. 5, shown therein is a block diagram illustrating an AI training and optimization process 500 enabled by the combined use of real-world data and simulated environments from both the wearable garment 200 and robotic systems 302, 402, according to an embodiment. The optimization process facilitates the continuous refinement of embedded AI models used to interpret physiological signals and issue robotic control commands, ultimately enhancing system performance and user experience.
[0046] In a first data stream, real-world data is generated directly from the wearable device 502 during normal operation. This includes raw sensor signals from EMG, EKG/ECG, IMUs, capacitive touch, and other modalities. The data is subsequently extracted, filtered, and annotated 506 through pre-processing pipelines that ensure its usability and quality for AI model training. Once processed, the annotated real-world data 510 is compiled into training datasets used to improve motion-intent recognition, health-state detection, and robotic coordination.
[0047] In parallel, a simulated environment is constructed using a digital twin 504 of the wearable system and its integrated sensors. This virtual model enables the generation of synthetic sensor data through controlled simulations that mimic physical interactions, user movements, and varying operational conditions. These simulations may incorporate generative machine learning approaches such as Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and Diffusion Models to produce diverse and realistic synthetic data 508. Once labeled, this annotated synthetic data 512 serves as an independent or complementary training resource.
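Of the generative approaches named above, a variational autoencoder is perhaps the simplest to sketch. The PyTorch toy below, with illustrative layer sizes and window length, shows how such a model could reconstruct fixed-length sensor windows and then sample new synthetic ones; it is a sketch under stated assumptions, not a disclosed architecture.

```python
import torch
import torch.nn as nn

class EMGVAE(nn.Module):
    """Tiny VAE over fixed-length sensor windows; all sizes are illustrative."""
    def __init__(self, window=256, latent=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(window, 128), nn.ReLU())
        self.mu = nn.Linear(128, latent)
        self.logvar = nn.Linear(128, latent)
        self.dec = nn.Sequential(nn.Linear(latent, 128), nn.ReLU(),
                                 nn.Linear(128, window))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        return self.dec(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    rec = nn.functional.mse_loss(recon, x, reduction="mean")
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + 1e-3 * kld

model = EMGVAE()
x = torch.randn(32, 256)                   # stand-in batch of real windows
recon, mu, logvar = model(x)
vae_loss(recon, x, mu, logvar).backward()  # one illustrative training step
synthetic = model.dec(torch.randn(8, 16))  # sample synthetic windows 508
print(synthetic.shape)                     # torch.Size([8, 256])
```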
[0048] Both real-world and synthetic datasets 510, 512 may be combined 514 using Sim2Real alignment techniques, ensuring cross-domain consistency and enhancing generalization. This fusion process may involve domain adaptation methods, statistical harmonization, or latent space matching to bridge the gap between physical and virtual datasets.
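One concrete instance of the statistical-harmonization option is CORAL-style covariance alignment, sketched below under the assumption that both domains are represented by fixed-length feature vectors; the feature dimension and the data are synthetic stand-ins rather than anything disclosed herein.

```python
import numpy as np

def matrix_power(cov, p):
    """Symmetric matrix power via eigendecomposition."""
    vals, vecs = np.linalg.eigh(cov)
    return vecs @ np.diag(vals ** p) @ vecs.T

def coral_align(source, target, eps=1e-6):
    """Re-color source features so their covariance matches the target's."""
    cs = np.cov(source, rowvar=False) + eps * np.eye(source.shape[1])
    ct = np.cov(target, rowvar=False) + eps * np.eye(target.shape[1])
    centered = source - source.mean(axis=0)
    aligned = centered @ matrix_power(cs, -0.5) @ matrix_power(ct, 0.5)
    return aligned + target.mean(axis=0)

rng = np.random.default_rng(0)
synthetic = rng.normal(0.0, 2.0, size=(500, 8))  # digital-twin features (512)
real = rng.normal(0.5, 1.0, size=(300, 8))       # wearable features (510)
fused = np.vstack([coral_align(synthetic, real), real])  # combined set (514)
print(fused.shape)  # (800, 8)
```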
[0049] The integrated dataset 514 is then used for AI model training and refinement through Transfer Learning 516. Techniques such as CNN-LSTM hybrids, Transformer-based sequence modeling, and fine-tuning are employed to adapt pre-trained models to the specific signal distributions and task requirements of the wearable-robot system.
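A minimal sketch of a CNN-LSTM hybrid, together with a transfer-learning-style step that freezes the feature extractor and fine-tunes only the classification head, appears below. Channel counts, window length, and class count are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    """CNN front-end over raw channels plus LSTM over time; sizes illustrative."""
    def __init__(self, channels=8, classes=5):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.lstm = nn.LSTM(32, 64, batch_first=True)
        self.head = nn.Linear(64, classes)

    def forward(self, x):              # x: (batch, channels, time)
        feats = self.cnn(x).transpose(1, 2)  # (batch, time/4, 32)
        _, (h, _) = self.lstm(feats)
        return self.head(h[-1])

model = CNNLSTM()
for p in model.cnn.parameters():       # freeze the pretrained feature extractor
    p.requires_grad = False
optim = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
logits = model(torch.randn(16, 8, 512))  # stand-in EMG/IMU windows
loss = nn.functional.cross_entropy(logits, torch.randint(0, 5, (16,)))
loss.backward()
optim.step()                           # fine-tune the head only
print(logits.shape)                    # torch.Size([16, 5])
```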
[0050] Following training, the models undergo optimization and validation 518. This includes procedures such as hyperparameter tuning, Bayesian optimization, and performance benchmarking under variable conditions. The result is an adaptive AI system capable of real-time intent inference and responsive robotic actuation, continuously learning from both simulated and real-world interactions.
[0051] Referring to FIG. 6, shown therein is a block diagram illustrating the signal pathway 600 and real-time processing architecture used to provide user feedback in response to actions detected by the wearable garment 200. The process 600 begins when a user initiates a motion/action or physiological response 602. In response to the user action 602, sensors integrated into the wearable 200, such as EMG and EKG/ECG electrodes 604, inertial measurement units (IMUs) 606, temperature sensors 608, and piezoresistive pressure sensors 610, generate multimodal data that captures the user's physiological state and biomechanical movement.
[0052] This raw sensor data is transmitted to and interpreted by an embedded microcontroller 612 or an external cloud-based processing unit. The system then performs contextual analysis by aggregating, filtering, and aligning sensor streams in real time 614. Signal fusion and interpretation modules apply algorithms that combine data from heterogeneous sources to extract actionable insights. These techniques may include statistical fusion, time-synchronized filtering, pattern recognition, and event correlation frameworks.
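A minimal sketch of the time-synchronized alignment step follows: two streams sampled at different native rates are linearly interpolated onto a shared analysis clock before fusion. The rates, duration, and signals are illustrative stand-ins.

```python
import numpy as np

def resample_to_grid(timestamps, values, grid):
    """Linearly interpolate an irregular sensor stream onto a shared clock."""
    return np.interp(grid, timestamps, values)

# Streams at different native rates (timestamps in seconds; values synthetic)
rng = np.random.default_rng(2)
t_emg = np.linspace(0, 1, 1000)        # ~1 kHz EMG envelope
t_imu = np.linspace(0, 1, 100)         # ~100 Hz IMU pitch angle
emg = np.abs(rng.standard_normal(1000))
imu = np.cumsum(rng.standard_normal(100)) * 0.01

grid = np.linspace(0, 1, 200)          # shared 200 Hz analysis clock
fused = np.column_stack([
    resample_to_grid(t_emg, emg, grid),
    resample_to_grid(t_imu, imu, grid),
])
print(fused.shape)  # (200, 2): time-aligned multimodal feature rows
```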
[0053] The output of the fusion process enables the system to detect and classify user states and events such as posture deviation, exertion thresholds, and significant motion gestures 616. These events may trigger feedback mechanisms embedded in the wearable, including haptic cues, smart-material-induced stiffness adjustments, or other context-specific alerts. The architecture supports both threshold-based detection and event-driven interrupt protocols, ensuring low-latency responsiveness to user behavior while maintaining robust contextual awareness.
[0054] This processing pipeline 600 enhances the intelligence and responsiveness of the wearable system, allowing it to operate autonomously or in tandem with robotic systems to close the feedback loop between sensed user actions and real-time actuation or guidance.
[0055] Referring to FIG. 7, shown therein is a diagram illustrating the bi-directional data flow 700 between the wearable interface/garment 100, 200 and the robotic system 302, 402, emphasizing real-time signal transmission, distributed AI processing, and closed-loop actuation. During continuous wear of the garment, an event trigger 702 is persistently generated in parallel with the real-time streaming of filtered physiological and motion data 704 from the wearable's onboard sensors and processing units.
[0056] Both the event trigger 702 and streaming data 704 are transmitted simultaneously through encrypted wireless communication channels to two destinations: the robotic system 710 and the cloud server 708. Within the robotic system, the incoming data is processed locally by the onboard AI engine and computer 714 to enable low-latency control decisions. Concurrently, the cloud platform 712 performs additional
processing and analysis using larger computational resources for extended learning, anomaly detection, and performance optimization.
[0057] This dual-stream processing pipeline enables the robotic system and the cloud to jointly contribute to the feedback mechanisms provided to the user. Bi-directional wireless feedback 716, which may include haptic signals, garment stiffness modulation, or visual/audio cues depending on the robot's state and external conditions, is continuously delivered to the wearable user 720.
[0058] Simultaneously, the robot's processed outputs are routed to its internal actuation and control system 718, enabling context-aware responses such as motion commands, force adjustments, or behavioural modifications. The framework 700 thus provides a real-time feedback architecture that leverages simultaneous local and remote AI processing to ensure adaptive, intelligent, and responsive interaction between human operators and robotic systems.
[0059] According to various embodiments, the disclosed wearable health tracker and motion interface may be employed across multiple domains to facilitate precise, real-time control of robotic systems. The system interprets electromyographic (EMG) and motion signals generated by the user to deliver actionable control instructions to collaborative robots, surgical systems, warehouse automation platforms, and food preparation robots. In return, the robotic systems may provide haptic feedback, motion cues, or garment-level actuation to close the control loop.
[0060] According to an embodiment, a collaborative robot (cobot) is controlled via the user's intent as interpreted from EMG signals and inertial motion data from the wearable. The cobot may be configured to assist with tasks such as maneuvering heavy components, performing alignments, or supporting the user during repetitive lifting operations. When force readings exceed threshold limits, the robotic system transmits force alerts or corrective alignment cues back to the wearable. In response, the garment actuates localized haptic modules or dynamically adjusts the stiffness of its smart materials to notify or assist the user.
[0061] According to another embodiment, the wearable 200 may be used in remote surgical procedures or telepresence operations. Subtle gestures and muscle contractions
are captured by the wearable interface and transmitted to a robotic surgical platform, enabling accurate manipulation of tools in real-time. In scenarios involving tissue resistance or abrupt tool feedback, the robotic system may trigger alerts which are then conveyed through haptic actuators or smart materials’ tensioning mechanisms embedded within the wearable, thereby maintaining the surgeon’s spatial and tactile awareness during delicate interventions.
[0062] According to another embodiment, the wearable 200 facilitates hands-free control of autonomous or semi-autonomous logistics robots. Warehouse operators equipped with the wearable may issue directional or retrieval commands via motion intent, enabling efficient item selection and transport. Feedback loops may alert the operator to completion of tasks, obstructions, or improper alignment via garment-based vibrations or stiffness adjustments.
[0063] According to another embodiment, the wearable 200 may be employed in industrial kitchens or food production environments where repetitive and precision-based tasks are automated. A robotic arm may be controlled by the user to perform stirring, slicing, or dispensing operations. As the robot encounters variations in viscosity, resistance, or load, this data is communicated back to the wearable and translated into tactile cues that help the user adapt movement instructions in real-time, ensuring consistent output and safe operation.
[0064] These illustrative embodiments demonstrate how the disclosed wearable 200 enables real-time, bi-directional interaction between human users and robotic systems across multiple industries, supporting intuitive control, situational awareness, and task optimization.
[0065] According to various embodiments, the integration of artificial intelligence (AI) into wearable-controlled robotic systems enhances system responsiveness, adaptability, prediction, and precision. The methodologies described herein are configured to transform multimodal sensor data from the wearable garment, such as that captured in the systems of FIGS. 1-3, into refined robotic control signals that adapt in real time to user input.
[0066] According to an embodiment, imitation learning enables the robotic system to replicate complex human motions based on sensor-rich data collected from the wearable interface, as illustrated in FIG. 3. In this configuration, EMG electrodes, inertial sensors, and capacitive touch sensors continuously record the user's movements during task performance. Behaviour cloning models are used to map these recorded trajectories directly to robotic motion control parameters. In more advanced variants, the system applies Inverse Reinforcement Learning (IRL) to uncover the user's underlying objectives or Generative Adversarial Imitation Learning (GAIL) to improve generalization in diverse operational contexts.
[0067] According to another embodiment, the AI system leverages temporal sensor data sequences, including those shown in the signal streaming pathways of FIG. 6, to forecast imminent user actions. Recurrent neural networks (RNNs), long short-term memory (LSTM) networks, and transformer-based architectures are trained to anticipate limb movement, gesture transitions, or intent shifts based on previous motion patterns. By predicting user actions in advance, the robotic system can prepare actuation pathways or alter tool trajectories, significantly reducing operational latency and improving flow during human-robot collaboration.
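As a minimal sketch of the forecasting idea, assuming joint-angle histories as input, the PyTorch model below uses an LSTM to emit a one-step-ahead prediction; the feature count, hidden size, and history length are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MotionForecaster(nn.Module):
    """LSTM that predicts the next joint-angle sample from a history window."""
    def __init__(self, features=6, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(features, hidden, batch_first=True)
        self.out = nn.Linear(hidden, features)

    def forward(self, x):             # x: (batch, time, features)
        h, _ = self.lstm(x)
        return self.out(h[:, -1])     # one-step-ahead prediction

model = MotionForecaster()
history = torch.randn(4, 50, 6)       # 50 past samples of 6 joint angles
next_pose = model(history)            # anticipated limb configuration
print(next_pose.shape)                # torch.Size([4, 6])
```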
[0068] In an embodiment, reinforcement learning frameworks optimize robotic behaviours by learning from the environment and user interaction data transmitted through the wearable interface. Feedback loops, represented in FIG. 7 as bi-directional wireless feedback 716, serve as real-time reward signals for tuning robotic actions such as grip strength, velocity, or trajectory. Wearable-based signals, such as tensioning of smart materials (e.g., electroactive polymers, dielectric elastomers, or shape memory alloys) or localized haptic cues, can guide users toward optimal behaviour, while the robot adapts via continuous reinforcement updates.
[0069] According to an embodiment, the system integrates signals from multiple sensor modalities, such as those depicted in FIGS. 1 and 2, to generate robust control outputs. Data from EMG electrodes, IMUs, piezoresistive layers, and temperature sensors are aggregated and synchronized using deep learning fusion models that may include convolutional neural networks (CNNs), attention-based transformers, or
probabilistic graphical models like hidden Markov models (HMMs). This contextual fusion enables reliable motion tracking and interpretation, even in noisy environments or during overlapping movements.
[0070] According to an embodiment, personalized calibration is provided via few-shot and continual learning. To accommodate inter-user variability, few-shot learning algorithms enable the wearable to adapt to a new user's biomechanical signature with minimal training data, as could be derived from early usage phases shown in FIG. 5's real-world data stream 502. Continual learning systems allow the AI models to incrementally update and retain learned behaviors over time.
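One simple reading of few-shot calibration, sketched under the assumption of a frozen pretrained feature extractor, is to adapt only a lightweight classification head on a handful of labeled windows from the new wearer. All layer sizes, the 8-sample support set, and the adaptation schedule below are illustrative.

```python
import torch
import torch.nn as nn

# Hypothetical pretrained intent classifier; only the head is adapted per user.
backbone = nn.Sequential(nn.Linear(64, 128), nn.ReLU())
head = nn.Linear(128, 4)

for p in backbone.parameters():       # population-level features stay frozen
    p.requires_grad = False

# Few-shot calibration: a handful of labeled windows from the new wearer
support_x = torch.randn(8, 64)        # 8 stand-in calibration samples
support_y = torch.randint(0, 4, (8,))

optim = torch.optim.SGD(head.parameters(), lr=0.05)
for _ in range(20):                   # brief per-user adaptation loop
    loss = nn.functional.cross_entropy(head(backbone(support_x)), support_y)
    optim.zero_grad()
    loss.backward()
    optim.step()
print(f"calibration loss: {loss.item():.3f}")
```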
[0071] In various embodiments, semantic models are trained to infer higher-level user intentions from sensor input sequences. Using datasets derived from both annotated real-world and simulated interactions, as illustrated in FIG. 5, AI classifiers can interpret motion cues to identify task-level goals such as lifting, gesturing, or halting. This enables the robotic system to execute actions autonomously or in coordination with the user's predicted intent, enhancing the efficiency of collaborative workflows.
[0072] According to another embodiment, optimized AI models are deployed directly on embedded processors within the wearable interface or its companion controller, such as the module placements shown in FIG. 2. Techniques like model quantization, knowledge distillation, and parameter pruning ensure efficient inference even under resource constraints. These edge-AI capabilities support low-latency response and offline functionality, enabling the wearable-robot system to operate independently of cloud infrastructure.
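Of the compression techniques named, post-training dynamic quantization is the easiest to sketch. The PyTorch example below quantizes the linear layers of a stand-in intent model to int8 weights; the model itself is a placeholder, not a disclosed architecture.

```python
import torch
import torch.nn as nn
from torch.ao.quantization import quantize_dynamic

model = nn.Sequential(                # stand-in intent-recognition model
    nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 4)
)

# Post-training dynamic quantization: weights stored as int8, activations
# quantized on the fly; suited to the linear layers of this placeholder model.
quantized = quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 64)
print(quantized(x).shape)             # torch.Size([1, 4])
```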
[0073] According to various embodiments, in healthcare applications, sensor data captured from the wearable (e.g., posture deviations or EMG fatigue patterns) can be cross-referenced with electronic health records (EHRs), imaging scans, or genomic data to create a holistic patient profile. AI-driven integration supports use cases such as early detection of neurodegenerative disorders, rehabilitation protocol optimization, and remote health monitoring. FIG. 6's sensor fusion and classification blocks 614 reflect the underlying infrastructure that can support these broader medical analytics.
[0074] According to an embodiment, a digital twin of the wearable device, as outlined in FIG. 5's synthetic data generation stream 508, is created to simulate sensor data under variable operational conditions. These simulations model real-world phenomena, such as stress, heat, or mechanical deformation, using differentiable physics engines. Synthetic data generated from the digital twin is then aligned with physical data using Sim2Real techniques, including generative adversarial networks (GANs), variational autoencoders (VAEs), and latent-space harmonization methods. The result is a robust AI training set that enhances system performance and generalizability across diverse users and environments.
[0075] Collectively, these AI-driven methodologies substantially elevate the functional intelligence of the wearable robotic system, enabling highly adaptive, predictive, and personalized interactions in both industrial and clinical domains.
[0076] While the above description provides examples of one or more apparatus, methods, or systems, it will be appreciated that other apparatus, methods, or systems may be within the scope of the claims as interpreted by one of skill in the art.
Claims
1. A wearable interface comprising:
  a flexible textile substrate comprising:
    conductive threads configured to transmit electrical signals; and
    smart material strands configured for altering mechanical stiffness upon electrical or thermal stimulation;
  a plurality of embedded sensors integrated into the textile substrate, the sensors configured to sense one or more of: electromyographic (EMG) signals, electrocardiographic (ECG) signals, inertial motion data, and capacitive or resistive pressure input;
  at least one microcontroller operatively coupled to the embedded sensors, the microcontroller configured to: perform local signal processing and feature extraction using machine learning algorithms, generate motion-intent or health-state control signals, and transmit the control signals to a robotic system via a wireless communication interface; and
  a feedback module integrated within the textile substrate and configured to provide tactile and/or haptic feedback to a user based on status signals received from the robotic system, the feedback module comprising one or more of: haptic actuators, and actuated smart material strands.
2. The wearable interface of claim 1, wherein the smart material strands are configured to vary stiffness dynamically in response to a magnitude of EMG signal inputs.
3. The wearable interface of claim 1, wherein the sensors are arranged in modular, detachable patches adapted for repositioning on different regions of a user's body.
4. The wearable interface of claim 1, wherein the microcontroller includes an edge AI processor configured to execute a pre-trained intent recognition model.
5. The wearable interface of claim 1, wherein the wireless communication interface comprises one or more of Bluetooth Low Energy (BLE), Wi-Fi, Ultra-Wideband (UWB), or 5G protocols.
6. The wearable interface of claim 1, wherein the haptic actuators include vibration motors disposed at anatomical cue points to guide motion or provide alerts.
7. The wearable interface of claim 1, wherein the sensors further comprise bioimpedance sensors or temperature sensors.
8. The wearable interface of claim 1, wherein at least one sensor is powered by a triboelectric or a thermoelectric generator integrated into the wearable interface.
9. A method for controlling a robotic system using a wearable interface, comprising:
  acquiring physiological and motion data from a user by a plurality of sensors integrated within the wearable interface;
  processing the physiological and motion data in real-time by a microcontroller embedded in the wearable interface to determine user intent;
  wirelessly transmitting control commands based on the user intent to a robotic system;
  receiving feedback signals from the robotic system; and
  actuating one or more haptic or smart materials in the wearable interface to provide feedback to the user in response to the received feedback signals.
10. The method of claim 9, further comprising a calibration step that adjusts machine learning thresholds for motion intent based on initial baseline readings from the user.
11. The method of claim 9, wherein the control commands include both continuous movement instructions and discrete gesture-based triggers.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463633032P | 2024-04-11 | 2024-04-11 | |
| US63/633,032 | 2024-04-11 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025213278A1 true WO2025213278A1 (en) | 2025-10-16 |
Family
ID=97349174
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CA2025/050537 Pending WO2025213279A1 (en) | 2024-04-11 | 2025-04-11 | Robotic structural sensing and protective systems and methods |
| PCT/CA2025/050536 Pending WO2025213278A1 (en) | 2024-04-11 | 2025-04-11 | Wearable motion interface for controlling robotic systems |
Country Status (1)
| Country | Link |
|---|---|
| WO (2) | WO2025213279A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180361566A1 (en) * | 2013-03-15 | 2018-12-20 | Sri International | Electrolaminate Clutches for an Exosuit System |
| US20200055195A1 (en) * | 2017-05-03 | 2020-02-20 | Taiga Robotics Corp. | Systems and Methods for Remotely Controlling a Robotic Device |
| US20200163621A1 (en) * | 2013-09-17 | 2020-05-28 | Medibotics Llc | Smart Clothing with Inertial, Strain, and Electromyographic Sensors for Human Motion Capture |
| US20200393905A1 (en) * | 2017-09-28 | 2020-12-17 | John J. Daniels | Wearable Electronic Haptic Feedback System for VR/AR and Gaming |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8224485B2 (en) * | 2006-05-24 | 2012-07-17 | Titan Medical Inc. | Snaking robotic arm with movable shapers |
| CN203804999U (en) * | 2014-03-20 | 2014-09-03 | 西北工业大学 | Shape memory alloy spring driven flexible mechanical arm |
| US20230158688A1 (en) * | 2021-11-24 | 2023-05-25 | University Of Southern California | Soft robotics, autonomous, space inspection, crawling robot |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025213279A1 (en) | 2025-10-16 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25785365; Country of ref document: EP; Kind code of ref document: A1 |