
WO2025122069A1 - Computer-implemented method for controlling the operation of a robotic prosthesis - Google Patents

Computer-implemented method for controlling the operation of a robotic prosthesis

Info

Publication number
WO2025122069A1
Authority
WO
WIPO (PCT)
Prior art keywords
computer
controlling
user
implemented method
robotic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/SV2024/000001
Other languages
English (en)
Spanish (es)
Inventor
Santos Moises RECINOS MONTOYA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of WO2025122069A1
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
            • A61B 5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
            • A61B 5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
              • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
            • A61B 5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
              • A61B 5/25 Bioelectric electrodes therefor
        • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
          • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
            • A61F 2/50 Prostheses not implantable in the body
              • A61F 2/68 Operating or control means
                • A61F 2/70 Operating or control means electrical
                  • A61F 2/72 Bioelectric control, e.g. myoelectric
    • G PHYSICS
      • G06 COMPUTING OR CALCULATING; COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
          • G06N 20/00 Machine learning
          • G06N 3/00 Computing arrangements based on biological models
            • G06N 3/02 Neural networks
              • G06N 3/08 Learning methods
          • G06N 5/00 Computing arrangements using knowledge-based models
          • G06N 7/00 Computing arrangements based on specific mathematical models

Definitions

  • a computer-implemented method for controlling the operation of a robotic prosthesis using an algorithm allows a subject to control the prosthesis using electrical impulses captured by EMG and EEG sensors.
  • the information obtained is processed by artificial intelligence, emulating the functions of an upper limb.
  • Document MX2017009980A describes an articulated arm prosthesis mechanism controlled by toe movements, using gears and pulleys.
  • this invention is complex, since each toe must perform a different action, which involves a considerable learning curve, and the system could not be controlled by people without lower limbs.
  • the present invention allows operation through brain signals and does not depend on specific movements of other limbs.
  • patent document CN103892945B proposes a myographic joint system without EMG sensors; however, that invention could not work if the patient does not have an arm, it lacks safety sensors to protect the user and the prosthesis, and it requires constant maintenance due to its gear system.
  • the present invention overcomes these problems by integrating protection and self-diagnosis sensors that alert the user to failures and allow adjustments without the need for equipment.
  • Document US10980466B2 discusses a system based on EEG sensors to control a prosthesis, but it does not account for possible errors in the precision of the patient's movements: it relies on a single type of sensor, and after about 2 hours of contact with the skin, sweat tends to distort the captured delta and alpha wave signals.
  • the computer-implemented method for which protection is sought controls the operation of a robotic prosthesis using not only EEG sensors but also EMG sensors, together with a full set of sensors that protect against hostile environments for the well-being of the patient, as well as a set of algorithms and a robotic system.
  • the computer-implemented method disclosed in the present invention for controlling the operation of a robotic prosthesis seeks to reintegrate amputee citizens into the workplace and society.
  • the computer-implemented method for controlling the operation of a robotic prosthesis controls said prosthesis using EMG and EEG bioelectrical signals from a subject.
  • an initial configuration is performed on a computer, generating a customized algorithm based on the user's profile.
  • This algorithm is then loaded into a nanocomputer located inside the robotic prosthesis.
  • the EMG and EEG sensors placed on the user send signals to the nanocomputer via specific communication media.
  • artificial intelligence algorithms are used to process these signals, interpreting the user's movement intentions, which allows the servomotors in the prosthesis to be activated to perform the desired actions.
  • the method can integrate various additional sensors, for example, temperature, humidity, and heart rate sensors, as well as an inertial measurement unit (IMU), which aids in the precise detection of position and movement.
  • a virtual assistant can also be available to interact with the user and facilitate control of the robotic prosthesis, while an internal diagnostic area allows the prosthesis's status and energy level to be reviewed.
  • the entire method is managed through a specially designed computer program loaded onto the nanocomputer, which coordinates and executes each of the functions to offer an optimized and adaptable user experience for a subject.
  • the computer-implemented method for controlling the operation of a robotic prosthesis which is the subject of protection in this document basically consists of the following steps (an illustrative sketch of this pipeline is given after the list):
  • step (a) configure, using a computer, a customized algorithm based on the user's profile;
  • step (b) load the algorithm obtained in step (a) into the nano-computer;
  • step (c) receive the signals from the EMG and EEG sensors placed on the user;
  • step (d) process the signals from step (c) with algorithms and artificial intelligence;
  • step (e) activate the servomotors with the signals processed in step (d).
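  • For illustration only, the following is a minimal Python sketch of how steps (b) to (e) might run on the nanocomputer. The names (SensorFrame, StubClassifier, read_frame, servos) and the stand-in decision rule are assumptions and are not part of the original disclosure; the actual algorithm and AI model are not specified here.

```python
# Minimal, hedged sketch of steps (b)-(e) of the described method.
# All identifiers below are hypothetical.
from dataclasses import dataclass, field

@dataclass
class SensorFrame:
    emg: float                                # normalized EMG amplitude (0.0-1.0)
    eeg: list = field(default_factory=list)   # raw EEG samples for the window

class StubClassifier:
    """Stand-in for the AI model loaded in step (b)."""
    def predict(self, frame: SensorFrame) -> int:
        # A naive rule replaces the trained model: EMG activity at or above
        # the lowest threshold mentioned in the text (0.2) is treated as a
        # movement intention.
        return 1 if frame.emg >= 0.2 else 0

def control_loop(read_frame, classifier, servos) -> None:
    """Step (c): acquire signals; step (d): classify; step (e): actuate."""
    while True:
        frame = read_frame()                  # EMG/EEG sensors placed on the user
        option = classifier.predict(frame)
        if option:
            servos.execute(option)            # drive the prosthesis servomotors
```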
  • the present invention provides a computer-implemented method for controlling the operation of a robotic prosthesis, wherein only EEG signals from sensors placed on a user can be received.
  • the computer-implemented method of the present invention involves additional signals received from sensors such as humidity, temperature, heart rate, and master sensors.
  • Another preferred embodiment of the computer-implemented method for controlling the operation of a robotic prosthesis comprises a virtual assistant, a fault diagnosis system and energy charging.
  • each algorithm is unique and will differ for an adult, a young adult, or a child.
  • the algorithm is programmed in editable programming code, but once introduced into the nanocomputer, it cannot be modified.
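  • Purely as an illustration of the two preceding items, the following Python sketch shows how a per-profile parameter set might be generated on a computer before being loaded into the nanocomputer. The profile names, threshold values and JSON format are assumptions; the disclosure only states that the algorithm is customized per user profile and cannot be modified once loaded.

```python
# Hypothetical per-user configuration step; all values are assumptions.
import json

PROFILE_PRESETS = {
    "adult":       {"emg_thresholds": [0.2, 0.4, 0.6], "max_rotation_deg": 300},
    "young_adult": {"emg_thresholds": [0.2, 0.4, 0.6], "max_rotation_deg": 300},
    "child":       {"emg_thresholds": [0.15, 0.3, 0.45], "max_rotation_deg": 200},
}

def build_profile(user_group: str, emergency_phone: str = "") -> dict:
    """Return the customized parameter set for one user."""
    profile = dict(PROFILE_PRESETS[user_group])
    profile["emergency_phone"] = emergency_phone   # used by the ECG alert feature
    return profile

# The serialized profile would then be written to the nanocomputer and,
# per the disclosure, treated as read-only from that point on.
config_json = json.dumps(build_profile("adult"), indent=2)
```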
  • Fig. 1 Front-top three-dimensional view of the robotic prosthesis.
  • Fig. 2 EMG signal operation graph.
  • Fig. 3 Explanatory diagram of the internal and external components of the nano-computer.
  • Fig. 4 Diagram of the method for controlling the operation of a robotic prosthesis.
  • Fig. 5 Flowchart of the computer program.
  • Fig. 6 Location of EMG and EEG sensors on a user.
  • Fig. 7 Brain waves identified in an electroencephalogram.
  • Figure 1 shows the robotic prosthesis from a top-front three-dimensional view, displaying at least one EMG sensor (1) that captures bioelectrical signals from the user's muscles.
  • the robotic prosthesis has a lower and upper housing (2).
  • the servomotor area (5) is also protected by a housing (7).
  • All the internal components described above that form part of the robotic prosthesis are embedded within a hand-shaped silicone protector (6). The silicone closely resembles human skin, and its shape is obtained by scanning the user's existing hand with a 3D scanner, producing a hand similar to the real one.
  • Below the silicone hand are the fingers and joints printed on a 3D printer.
  • the diagnostic screen (3) and an electrical branch (4) can also be seen.
  • Figure 1 shows at least one rechargeable battery (8) and also a solar panel (9), the latter as an alternative power supply. This entire set of devices constitutes the housing of the robotic prosthesis, and is responsible for its operation.
  • Figure 2 displays the EMG signal performance graph of the computer-implemented method for controlling the operation of a robotic prosthesis.
  • the y-axis of the Cartesian plane shows a maximum threshold of 0.8, and the x-axis of the Cartesian plane shows the different threshold activation times.
  • the algorithm loaded into the nanocomputer determines, using artificial intelligence, which signals are useful and which are not. For example, if the signal reaches the 0.2 threshold, the servomotor is activated with a 100-degree rotation.
  • the graph shows the different thresholds (0.2, 0.4, and 0.6) depending on the EMG signal received; the last signal observed corresponds to the gyroscope.
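  • As a purely illustrative aid, the threshold comparison just described can be sketched in Python as follows; the 0.2, 0.4 and 0.6 levels come from the graph and the 100-degree rotation for the 0.2 level comes from the example above, while the rotations for the other levels are not specified in the text.

```python
# Map a normalized EMG amplitude to the highest threshold it reaches.
# Only the 0.2 -> 100-degree pairing is given in the description; the
# rotations for the 0.4 and 0.6 levels are unspecified and left as None.
from typing import Optional

THRESHOLDS = [(0.6, None), (0.4, None), (0.2, 100)]   # (level, rotation_deg)

def rotation_for(amplitude: float) -> Optional[int]:
    """Return the servo rotation for the highest threshold reached, if known."""
    for level, rotation in THRESHOLDS:
        if amplitude >= level:
            return rotation
    return None   # below the lowest useful threshold: no activation
```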
  • Figure 3 comprises an explanatory diagram of the internal and external components of the nano-computer (15), which comprises the main master sensor area (11), including EMG electromyography sensors, EEG electroencephalogram sensors and an ECG heart rate sensor. The wireless connection area (10) can also be seen, which receives signals from the EEG sensor. The wireless connection area additionally makes it possible to control smart devices, such as televisions and smartphones, among others.
  • the inertial measurement unit or IMU (13) is also shown; said IMU can be extended with a virtual assistant, a microphone and a speaker.
  • the IMU (13) measures and reports the speed, orientation and gravitational forces of the robotic prosthesis.
  • the IMU (13) can use either a wireless or a wired connection, and the user's gestures can then control smart devices. For example, to advance a slide on a computer via Bluetooth, it is enough to pair the robotic prosthesis with a computer or another device that has a wireless connection (an illustrative sketch of such gesture control is given below).
  • the robotic prosthesis uses the gyroscope and accelerometers of the IMU (13) to detect whether the prosthesis is rotating or moving forward or backward.
  • the inertial measurement unit or IMU (13) can have a speaker and a microphone connected, so that the user of the prosthesis can communicate with the virtual assistant.
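  • As a hedged illustration of the gesture-based control mentioned above, the following Python sketch polls the IMU gyroscope and sends a "next slide" key over a wireless link. The gyro-reading function, the angular-rate threshold and the key-sending helper are assumptions, not part of the original disclosure.

```python
# Hypothetical wrist-flick gesture detector for advancing a presentation.
import time

FLICK_THRESHOLD_DPS = 150.0   # assumed angular rate (deg/s) for a wrist flick
DEBOUNCE_S = 1.0              # ignore further flicks for one second

def gesture_slide_control(read_gyro_z, send_key) -> None:
    """Poll the IMU and send a 'next slide' key on a quick wrist flick."""
    last_sent = 0.0
    while True:
        rate = read_gyro_z()                       # deg/s around the wrist axis
        now = time.monotonic()
        if rate > FLICK_THRESHOLD_DPS and now - last_sent > DEBOUNCE_S:
            send_key("RIGHT_ARROW")                # advances the slide
            last_sent = now
        time.sleep(0.01)                           # ~100 Hz polling
```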
  • the ECG sensor is observed, which has the function of detecting the number of heartbeats per minute of the user.
  • the ECG sensor takes 120 beats per minute as the normal reference heart rate; if the ECG sensor detects that the heart rate has dropped to 100 bpm and then falls further to 60 bpm, or if the heart rate rises above 140 bpm, it is determined that the person may be minutes away from cardiac arrest, and the horn is turned on to alert people near the user wearing the robotic prosthesis that urgent medical attention is needed. If the user's heart rate continues to drop, the nano-computer (15) located in the robotic prosthesis can send a signal to the cell phone of a relative of the user, provided that said emergency phone number was previously registered in the algorithm (a hedged sketch of this alert logic follows).
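  • The following Python sketch is an approximation of the ECG alert logic described above. The numeric limits come from the description; the notification helpers (sound_horn, notify_contact) and the exact decision sequence are assumptions.

```python
# Hedged approximation of the ECG alert behaviour; helper functions are hypothetical.
LOW_WARNING_BPM = 100
CRITICAL_LOW_BPM = 60
CRITICAL_HIGH_BPM = 140

def check_heart_rate(bpm: int, emergency_phone: str,
                     sound_horn, notify_contact) -> None:
    """Raise local and remote alerts when the heart rate leaves the safe band."""
    if bpm <= CRITICAL_LOW_BPM or bpm >= CRITICAL_HIGH_BPM:
        sound_horn()                   # alert people near the user
        if emergency_phone:            # number must be registered beforehand
            notify_contact(emergency_phone, f"Urgent: heart rate {bpm} bpm")
    elif bpm <= LOW_WARNING_BPM:
        sound_horn()                   # early warning while the rate keeps falling
```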
  • the IMU (13) can connect to the Internet and can be linked to a virtual assistant. Because a microphone and a speaker can be integrated into the IMU (13), these devices are activated by recognizing the voice of the user of the robotic prosthesis, so the user can make an Internet query or play music, which is possible thanks to the internal memory of the nano-computer.
  • the user of the robotic prosthesis can control the movements of said prosthesis by means of voice and make basic configurations. For example, if the user wants to see something on the diagnostic screen (3), such as the status of the algorithm or watch a video or video game, the user must press "play" on said screen.
  • the nano-computer has nano microcontrollers, for example the Nano 33 BLE Sense, with which it can execute the tasks mentioned above.
  • in the sensor area (12), further sensors can be found, such as an "IR flame" sensor that detects high temperatures and a humidity sensor; these are additional master sensors (11) that provide safety to the user of the robotic prosthesis.
  • Figure 4 shows a diagram of the method for controlling the operation of a robotic prosthesis.
  • the diagram shows different areas, among which the sensors and connectors area (14), the nano-computer area (15), the artificial intelligence area (16), the EMG threshold signals and EEG waves area (17), and the servomotor activation area (18) can be mentioned.
  • in the first area, the sensors and connectors area (14), signals are constantly received from all the sensors; for example, signals are received from the EMG, EEG and ECG sensors placed on the user, and these signals are then passed on to the nano-computer area (15) for diagnosis.
  • the nano-computer area (15) also receives signals from the wireless connection area (10), for example EEG signals.
  • the nano-computer area (15) sends the preprocessed signals to the artificial intelligence area (16) to process and classify them. Once classified, these signals pass to the EMG threshold and EEG wave area (17), and the corresponding movement is carried out by the servomotors (18). If, for example, the user wants to open and close the fingers of the robotic prosthesis in order to hold a glass, the threshold for each finger will be 0.2 and each servomotor (18) will be activated with a rotation of 100 degrees.
  • Figure 5 reveals a flowchart of the computer program that executes the computer-implemented method for controlling the operation of a robotic prosthesis.
  • the flow is carried out as follows: the sensors collect readings (19), which are sent to the nanocomputer (20); the nanocomputer searches for matches by comparing the signals received from the sensors (19) and, when it finds a match, sends it to the artificial intelligence area (21), which analyzes and classifies in more detail the signals pre-processed by the nano-computer (20). Subsequently, the artificial intelligence area (21) selects, from the 4 programmed options (22), which option should be taken.
  • if option 1 is chosen, EMG threshold 1 or EEG wave 1 is activated and an action 1 is executed on the servomotors (23), which is seen as a 100-degree rotation of the robotic prosthesis, suitable for holding large objects such as notebooks or apples.
  • if option 4 is chosen, EMG threshold 4 or EEG wave 4 is activated and an action 4 is executed on the servomotors (23), resulting in a 300-degree movement that makes the thumb and little finger touch each other; this option is suitable for holding small objects such as a pencil or a sheet of paper with the robotic prosthesis (a hedged sketch of this option dispatch follows).
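  • The following Python sketch illustrates the four-option dispatch from the flowchart. Only options 1 and 4 are described in the text; the rotations for options 2 and 3 are left unspecified, and the servo interface is an assumption.

```python
# Hypothetical dispatch from the classified option to a servo action.
from typing import Optional

OPTION_ROTATION_DEG = {
    1: 100,    # large objects (notebooks, apples)
    2: None,   # not specified in the description
    3: None,   # not specified in the description
    4: 300,    # small objects (pencil, sheet of paper): thumb-little finger pinch
}

def execute_option(option: int, servos) -> None:
    """Drive the servomotors according to the option chosen by the AI area."""
    degrees: Optional[int] = OPTION_ROTATION_DEG.get(option)
    if degrees is not None:
        servos.rotate(degrees)
```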
  • Figure 6 shows the location of the EMG (1) and EEG (24) sensors in a user, which send signals to the nano-computer to control the operation of the robotic prosthesis.
  • Figure 7 shows the brain waves identified in an electroencephalogram.
  • the following wave types are displayed: delta, theta, alpha and beta, which normally occur at different times.
  • some of these waves are associated with sleep states, that is, when the user is asleep.
  • delta, theta, alpha, and beta waves can also be replicated when the user is awake.
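  • As a hedged illustration only, one common way (not specified in the disclosure) to separate the delta, theta, alpha and beta waves from an EEG window is to compute per-band power from the signal's spectrum; the sampling rate and band edges below are conventional assumptions.

```python
# Conventional EEG band-power computation; parameters are assumptions.
import numpy as np

BANDS_HZ = {"delta": (0.5, 4.0), "theta": (4.0, 8.0),
            "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_powers(samples: np.ndarray, fs: float = 250.0) -> dict:
    """Return the power in each classic EEG band for one window of samples."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    powers = {}
    for name, (lo, hi) in BANDS_HZ.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(spectrum[mask].sum())
    return powers
```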
  • One advantage of the computer-implemented method for controlling the operation of a robotic prosthesis is that the EMG and EEG sensors work together to provide more precise movement of the robotic prosthesis, but they can also operate independently. For example, if the user has a partial upper limb, both the EMG and EEG sensors are connected. Conversely, if a user is missing a complete upper limb, the EMG sensor cannot be connected, but the EEG sensor will be connected and will send brainwave signals to the nanocomputer. Another advantage is that if the user walks for 3 to 5 hours, they may sweat, and the EMG sensor may receive inaccurate signals. However, by having two sensors, the EMG and EEG sensors, the EEG sensor validates the received signal, and the prosthesis functions correctly. It is important to note that if any sensor fails, surgery is not necessary to replace them, as is done with other existing technologies.
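  • As a hedged sketch of the cross-validation advantage described above, the following Python fragment prefers the EMG decision while it is trustworthy and falls back to the EEG classification when the EMG reading becomes unreliable (for example, because of sweat). The reliability flag, the classifier interfaces and the reuse of the 0.2 threshold are assumptions.

```python
# Hypothetical EMG/EEG fusion rule; interfaces and values are assumptions.
from typing import Optional

def fused_decision(emg_amplitude: float, emg_reliable: bool,
                   eeg_option: Optional[int]) -> Optional[int]:
    """Return the movement option, preferring EMG when it is trustworthy."""
    if emg_reliable and emg_amplitude >= 0.2:
        return 1                     # EMG threshold crossed: basic grasp
    if not emg_reliable and eeg_option is not None:
        return eeg_option            # fall back to the EEG classification
    return None                      # no confident movement intention
```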

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • General Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Pathology (AREA)
  • Surgery (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Vascular Medicine (AREA)
  • Cardiology (AREA)
  • Computational Linguistics (AREA)
  • Transplantation (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Algebra (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Prostheses (AREA)

Abstract

The present invention relates to a computer-implemented method for controlling the operation of a robotic prosthesis. The method is carried out by receiving and processing EMG and EEG signals from a user, said signals being sent to a nano-computer, where artificial intelligence and specific algorithms interpret the received signals and activate different servomotors, thereby enabling movement of the robotic prosthesis. The invention further provides additional sensors to protect the user, an inertial measurement unit (IMU), a virtual assistant, and a fault-diagnosis and charging system. The method is managed by means of a computer program loaded onto the nano-computer.
PCT/SV2024/000001 2023-12-05 2024-11-04 Computer-implemented method for controlling the operation of a robotic prosthesis Pending WO2025122069A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SV2023006803 2023-12-05
SV2023006803 2023-12-05

Publications (1)

Publication Number Publication Date
WO2025122069A1 true WO2025122069A1 (fr) 2025-06-12

Family

ID=95980219

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SV2024/000001 Pending WO2025122069A1 (fr) 2023-12-05 2024-11-04 Computer-implemented method for controlling the operation of a robotic prosthesis

Country Status (1)

Country Link
WO (1) WO2025122069A1 (fr)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8709097B2 (en) * 2005-09-01 2014-04-29 össur hf Actuator assembly for prosthetic or orthotic joint
US9050200B2 (en) * 2007-05-02 2015-06-09 University Of Florida Research Foundation, Inc. System and method for brain machine interface (BMI) control using reinforcement learning
US11202715B2 (en) * 2011-04-15 2021-12-21 The Johns Hopkins University Multi-modal neural interfacing for prosthetic devices
US10318863B2 (en) * 2012-07-24 2019-06-11 Rehabilitation Institute Of Chicago Systems and methods for autoconfiguration of pattern-recognition controlled myoelectric prostheses
US20200265948A1 (en) * 2019-02-19 2020-08-20 Coapt Llc Electromyographic control systems and methods for the coaching of exoprosthetic users
US20230086004A1 (en) * 2021-09-19 2023-03-23 Zhi Yang Artificial Intelligence Enabled Neuroprosthetic Hand

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ERICKA JANET RECHY-RAMIREZ, HUOSHENG HU: "Bio-signal based control in assistive robots: a survey", DIGITAL COMMUNICATIONS AND NETWORKS, vol. 1, no. 2, 17 March 2015 (2015-03-17), pages 85 - 101, XP055697721, ISSN: 2352-8648, DOI: 10.1016/j.dcan.2015.02.004 *

Similar Documents

Publication Publication Date Title
Hasan et al. Innovative developments in HCI and future trends
EP2077754B1 (fr) Switchable joint constraint system
Fall et al. A multimodal adaptive wireless control interface for people with upper-body disabilities
Bertolotti et al. A wearable and modular inertial unit for measuring limb movements and balance control abilities
JP2018504162A (ja) Apparatus and method
US20130317648A1 (en) Biosleeve human-machine interface
Khadilkar et al. Android phone controlled voice, gesture and touch screen operated smart wheelchair
CN204819530U (zh) Bionic household service robot
JP2018101137A (ja) Capacitance detection circuit and method for determining eyelid position using the same
WO2018165307A1 (fr) Mitigation of head-mounted display impact via biometric sensors and language processing
Baldi et al. Design of a wearable interface for lightweight robotic arm for people with mobility impairments
KR20200087337A (ko) Health care robot and control method thereof
Gaetani et al. A prosthetic limb managed by sensors-based electronic system: Experimental results on amputees
CN110908515A (zh) Gesture recognition method and device based on wrist muscle pressure
WO2025122069A1 (fr) Computer-implemented method for controlling the operation of a robotic prosthesis
Ghaffar et al. Assistive smart home environment using head gestures and EEG eye blink control schemes
Salvekar et al. Mind controlled robotic arm
Gokulraj et al. Transhumeral Sensory System to Control Robotic Arm
CN107530165B (zh) Movable eye prosthesis and related systems and methods thereof
US20180267338A1 (en) Temperature-sensing ophthalmic device
Kapur et al. Wireless portable bot car controlled by brain signals
Samanta et al. Modeling and Designing of Gesture Control Robot
ES2561927B1 (es) Multisensor system for rehabilitation and interaction of people with disabilities
Rahman et al. A Neuro-Wave Controlled Wheelchair Combined with Gesture and Voice Control
Rafid et al. Development of a brain-computer interface (BCI) for person with disabilities to control their wheelchair using brain waves

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24901210

Country of ref document: EP

Kind code of ref document: A1