
WO2025153392A1 - Systems and methods for surgical procedure guidance - Google Patents

Systems and methods for surgical procedure guidance

Info

Publication number
WO2025153392A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
artificial intelligence
surgical
output
intelligence assistant
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2025/050452
Other languages
English (en)
Inventor
Gene Edward Austin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Smith and Nephew Orthopaedics AG
Smith and Nephew Inc
Original Assignee
Smith and Nephew Orthopaedics AG
Smith and Nephew Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smith and Nephew Orthopaedics AG and Smith and Nephew Inc
Publication of WO2025153392A1
Legal status: Pending


Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/40 ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/63 ICT specially adapted for the management or operation of medical equipment or devices for local operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/102 Modelling of surgical devices, implants or prosthesis
    • A61B 2034/104 Modelling the effect of the tool, e.g. the effect of an implanted prosthesis or for predicting the effect of ablation or burring
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/101 Computer-aided simulation of surgical operations
    • A61B 2034/105 Modelling of the patient, e.g. for ligaments or bones
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/108 Computer aided selection or customisation of medical implants or cutting guides
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/374 NMR or MRI
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B 2090/3762 Surgical systems with images on a monitor during operation using computed tomography systems [CT]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2090/3983 Reference marker arrangements for use with image guided surgery
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/50 Supports for surgical instruments, e.g. articulated arms
    • A61B 2090/502 Headgear, e.g. helmet, spectacles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots

Definitions

  • The present disclosure generally relates to methods, systems, and apparatuses for guiding surgical procedures such as, for example, computer-assisted surgical systems, as well as such procedures per se, and, in particular, to an artificial intelligence system for supporting such surgical procedures.
  • Surgery comprises complex and delicate processes that require a deep understanding of the medical device instruments and implants used, as well as adherence to step-by-step instructions, together with precision and accuracy in execution.
  • Surgeons rely on specific surgical techniques provided by medical device manufacturers and can also use feedback from various imaging technologies such as cameras and X-rays, together with audio communications and sensor data, to navigate and perform surgeries. Performing these steps from memory, or encountering patient-specific complications, can lead to mistakes that increase the risk of adverse consequences for patients.
  • Surgeons practicing at a community hospital may perform a specific surgical procedure relatively infrequently. The lack of regular exposure to the technique, instruments, and patient variability increases the risk of poor outcomes or worse such as, for example, surgical injury.
  • Patella and patella-tracking problems account for a substantive number of poor total knee arthroplasty (TKA) outcomes in patients. It is well-established that anterior knee pain after a TKA can be a typical post-surgery patellofemoral complication that may be attributed to issues with a native or implant patella.
  • TKA: total knee arthroplasty
  • Robot-assisted TKA or computer-assisted TKA may be used in preparing the femur and tibia, for example, by optimizing implant placement, leg alignment, soft tissue balancing, range-of-motion assessment, etc.
  • Although patella mal-positioning can result in patellar maltracking and, ultimately, pain and other significant complications, the patella is currently not addressed in conventional computer-assisted TKA.
  • The prominent surgical technique for patellar replacement is a free-hand technique. For example, before making the patellar cut, the surgeon measures the native or pre-cut patellar thickness with a surgical caliper and then makes the cut.
  • a patellar cut must be planned and executed with an adequate cut thickness and an appropriate implant size, and a design must be selected for the patellar component. Additionally, a position and orientation of the patellar component must be selected in conjunction with femoral and tibial component positions and orientations. Furthermore, the biomechanics of the patellofemoral joint may be determined in an effort to restore the normal function of the knee joint. Given the complexity of the patellofemoral anatomy and biomechanics, there is a high risk of patellofemoral complications because surgeons are required to make these various planning decisions intraoperatively with minimal available information and tools.
  • FIG. 1 depicts an operating theatre including an illustrative computer-assisted surgical system (CASS) in accordance with one or more features of the present disclosure
  • CASS: computer-assisted surgical system
  • FIG. 2 illustrates a view of a surgical computer and artificial intelligence assistant
  • FIG. 3 shows a view of a further example of the surgical computer having the artificial intelligence system integrated
  • Referring to FIG. 1, there is shown a view 100 of a CASS 102 according to an example.
  • the CASS 102 is arranged to aid surgeons in performing orthopedic surgical procedures such as, for example, a knee arthroplasty (e.g., total knee arthroplasty (TKA)) or total hip arthroplasty (THA).
  • An effector platform 104 positions surgical tools relative to a patient during surgery.
  • the effector platform 104 may include a robotic arm 104A and/or an end effector 104B that holds surgical tools or instruments during their use.
  • Effector platform 104 can include a limb positioner 104C for positioning the patient’s limbs during surgery.
  • Resection equipment (not shown in FIG. 1).
  • the CASS 102 comprises an optical tracking system 106 that uses one or more sensors to collect real-time position data that locates the patient’s anatomy and surgical instruments.
  • Any suitable tracking system can be used for tracking surgical objects and patient anatomy in the surgical theatre.
  • IR: infrared
  • Such an optical tracking system 106 can use the EMR retro-reflected from any of the retro-reflectors to determine real-time position data that locates at least one, or both, of: the patient’s anatomy and surgical instruments.
  • a fourth marker array 122 can be placed on the limb positioner 104C to assist in determining the position of a respective distal actuator 124 for holding a limb.
  • a fifth marker array 126 can be placed on eye-wear 128 of a surgeon 130.
  • a sixth marker array 127 can be located on the operating table 104E.
  • An object or body part can comprise a set of marker arrays.
  • Such a set of marker arrays per object or body part can be used to improve or increase the accuracy with which at least one, or both, of: position and orientation can be determined.
  • a tissue navigation system (not shown in FIG. 1) provides the surgeon with intraoperative, real-time visualization for the patient’s bone, cartilage, muscle, nervous, and/or vascular tissues surrounding the surgical area.
  • The CASS 102 can also comprise a display 108 to provide graphical user interfaces (GUIs) that display images collected by the tissue navigation system, as well as other information relevant to the surgery, to the surgeon 130 or other operating theatre staff.
  • GUIs: graphical user interfaces
  • the display 108 can overlay image information collected from various modalities (e.g., CT, MRI, X-ray, fluorescent, ultrasound, etc.) collected pre-operatively or intraoperatively to give the surgeon various views of the patient’s anatomy as well as real-time conditions.
  • a surgical computer 150 provides control instructions to various components of the CASS 102, collects data from those components, and provides general processing for various data needed during surgery.
  • The surgeon 130 is shown wearing the protective eye-wear or an augmented reality headset 128, which can also comprise or bear the respective marker 126.
  • the surgeon also has a microphone 132 and a set of wireless earphones 134 or the like.
  • While the retro-reflective entities described with reference to FIG. 1 above were marker arrays, examples are not limited thereto. Examples can be realised in which one or more, or all, of the marker arrays are markers instead.
  • The surgical computer 150 is also responsible for managing communications with an artificial intelligence assistant (AIA) 152, which will be described in greater detail with reference to FIGs. 2 to 5 taken jointly and severally.
  • AIA: artificial intelligence assistant
  • the surgical computer 150 comprises software 154 for managing communications and data exchanges with the artificial intelligence assistant 152.
  • The software 154 is an example of “surgical assistance circuitry”. Examples can be realised in which the artificial intelligence assistant 152 is remote from the surgical computer 150, that is, the artificial intelligence assistant 152 is accessible via, for example, a network 204.
  • the surgical computer 150 comprises communication hardware and software 206, which is an example of communication circuitry.
  • The term “communication circuitry” as used herein refers to such a combination of hardware and software for realising networked communications.
  • the communication circuitry 206 communicates with the artificial intelligence assistant 152 via a respective input/output interface 208.
  • The surgical computer 150 comprises a further input/output interface 210.
  • The further input/output interface 210 represents a collection or set of input/output interfaces arranged for interfacing, or otherwise exchanging data with, respective entities of a set of peripherals 212.
  • the set of peripherals 212 can comprise one, or more than one, peripheral. In the example depicted in FIG. 2, the set of peripherals comprises 12 peripherals. Examples can be realised in which the set of peripherals 212 comprises at least one, or more, of the following taken jointly and severally in any and all permutations:
  • the software 154 of the surgical computer 150 is arranged to receive, via the communication circuitry 206 and input/output interface 208, the response 252.
  • The response comprises respective response data 254 that reflects the response of the LLM 248 to the query and, in particular, to the data 250 contained within the query.
  • The software 154 is arranged to generate a set of output data 256.
  • the output data 256 comprises data suitable for processing by, or otherwise being output to, a respective set of peripherals of the set of peripherals 212.
  • the respective set of peripherals of the set of peripherals 212 can comprise one peripheral or multiple peripherals of the set of peripherals 212.
  • the output data 256 can be output to respective peripherals of the set of peripherals 212.
  • the set of peripherals 212 used to process the output data 256 will depend on the contents of the response 252, in particular, the content of the response data 254. Examples of the output data 256 being directed to respective peripherals of the set of peripherals 212 will be described below.
  • The data 254 of the response 252 may contain text-based directions for performing a surgical action.
  • The text-based directions are intended to be output to the surgeon. Outputting the text-based directions to the surgeon can be realised using one or more than one modality.
  • The text-based directions can be displayed on the display 230. Alternatively, or additionally, the text-based directions can be converted to speech that is output via one or more than one audio device. Examples can be realised in which the text data is displayed on, for example, a display of the augmented reality headset 218 or on a more generic display such as the above-mentioned display 230, or converted to speech for output via the earphones 237.
  • the set of input data 240 can comprise data representing a query generated by the surgeon. That query may have been issued, for example, using the microphone 216.
  • the microphone 216 will generate audio or speech data (not shown).
  • the speech data can be converted to a format suitable for processing by, or submission to, the large language model 248. Examples can be realised in which that format is text. Alternatively, or additionally, that format can comprise a digital representation of the speech uttered by, for example, the surgeon.
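The two input-side options described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the names `transcribe` and `prepare_query_payload`, and the dictionary payload shape, are assumptions, and the `transcribe` stand-in merely decodes bytes where a real system would call an ASR engine.

```python
import base64


def transcribe(audio_bytes: bytes) -> str:
    """Placeholder for a speech-to-text step; a real system would invoke
    an ASR engine here instead of decoding bytes directly."""
    return audio_bytes.decode("utf-8")


def prepare_query_payload(audio_bytes: bytes, llm_is_multimodal: bool) -> dict:
    """Express captured surgeon speech in a format suitable for the LLM:
    either a digital representation of the speech (multi-modal LLM) or
    text produced by a speech-to-text conversion."""
    if llm_is_multimodal:
        # Pass a digital representation of the speech directly.
        return {
            "modality": "audio",
            "data": base64.b64encode(audio_bytes).decode("ascii"),
        }
    # Otherwise convert the speech to text first.
    return {"modality": "text", "data": transcribe(audio_bytes)}
```

Either payload would then be embedded in the query 238 as the data 250 submitted to the LLM 248.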
  • Referring to FIG. 3, there is shown a view 300 of the surgical computer 150 having the artificial intelligence assistant 152 formed as an integral part thereof.
  • Reference numerals common to FIGs. 2 and 3 refer to the same entity and have the same operation. Therefore, the surgical computer 150 comprises the above-described software 154, the communication circuitry 206, and the input/output interface 210, as well as the artificial intelligence assistant 152 and associated software 242.
  • Referring to FIG. 4, there is shown a view 400 of a pair of flowcharts 402a and 402b describing processing undertaken by the software 154 and the artificial intelligence assistant 152, respectively.
  • the set of input data 240 is received.
  • the set of input data 240 is processed, at 406, to create the query 238 in a format that is suitable for processing by the artificial intelligence assistant 152 and, more particularly, by the large language model 248. More specifically, the set of input data 240 is expressed in the form of data 250 that is suitable for processing by the LLM 248.
  • the LLM 248 is multi-modal such that the set of input data 240 can merely be copied to data 250 for the LLM 248.
  • the query 238 is output or otherwise transmitted or communicated to the artificial intelligence assistant 152 at 408.
  • The response 252, comprising the response data 254, is received by the surgical computer 150.
  • The software 154 of the surgical computer 150 processes the response 252, in particular the response data 254, at 412, to generate the set of output data 256.
  • the set of output data 256 may comprise uni-modal data intended to be output via a single peripheral of the set of peripherals 212 or multi-modal data intended to be output by a respective set, or subset, of the set of peripherals 212.
  • The set of output data 256 is output, via the communication circuitry 206 and input/output interface 210, to a respective peripheral of the set of peripherals 212 or to a respective set or subset of the set of peripherals 212 at 414.
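The query-response cycle of flowcharts 402a and 402b can be sketched as below. The class and method names (`SurgicalComputer`, `AIAssistant`, `handle`, `submit`) are illustrative assumptions standing in for the software 154 and the assistant 152/LLM 248, not the patent's API.

```python
from dataclasses import dataclass


@dataclass
class Response:
    """Stands in for the response 252 carrying response data 254."""
    data: str


class AIAssistant:
    """Stand-in for the artificial intelligence assistant 152 / LLM 248."""

    def submit(self, query: dict) -> Response:
        # A real assistant would run the LLM over the query data here.
        return Response(data=f"guidance for: {query['data']}")


class SurgicalComputer:
    """Mirrors flowchart 402a: receive input, form a query, submit it,
    and route the response to output peripherals."""

    def __init__(self, assistant: AIAssistant):
        self.assistant = assistant

    def handle(self, input_data: str) -> dict:
        query = {"data": input_data}              # create the query (406)
        response = self.assistant.submit(query)   # send and receive (408, 410)
        # Generate output data routed to one or more peripherals (412, 414);
        # here the same text fans out to a display and to earphones.
        return {"display": response.data, "earphones": response.data}
```

In this sketch the same response text is duplicated to two peripherals, illustrating the uni-modal versus multi-modal output choice described above.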
  • the functionality of the surgical computer 150, the surgical computer software 154, the artificial intelligence assistant 152 and the large language model 248 can be realised using machine instructions that can be processed by a machine comprising, or having access, to the instructions.
  • The machine can comprise a computer, processor, processor core, DSP, a special-purpose processor implementing the instructions such as, for example, an FPGA or an ASIC, circuitry or other logic, a compiler, a translator, an interpreter or any other instruction processor.
  • Processing the instructions can comprise interpreting, executing, converting, translating or otherwise giving effect to the instructions.
  • the instructions can be stored using a machine readable medium, which is an example of machine-readable storage.
  • the machine-readable medium can store the instructions in a non-volatile, non-transient, manner or in a volatile, transient, manner.
  • the instructions can be arranged to give effect to any and all operations described herein taken jointly and severally in any and all permutations.
  • the instructions can be arranged to give effect to any and all of the operations, devices, systems, flowcharts, and methods described herein taken jointly and severally in any and all permutations.
  • the machine instructions can give effect to, or otherwise implement, the operations of the flowcharts, taken jointly and severally, depicted in, or described with reference to, figure 4, or any of the entities shown in any of the figures.
  • FIG. 5 shows a view 500 of machine instructions 502 stored using machine readable storage 504 for implementing the examples described herein.
  • the machine instructions 502 can be processed by, for example, a processor 506 or other processing entity, such as, for example, an interpreter, as indicated above.
  • machine instructions 510 to process the set of input data to generate the query 238 to be submitted to the artificial intelligence assistant 152;
  • machine instructions 514 to receive the response 252 from the artificial intelligence assistant 152;
  • machine instructions 516 to generate the output data 256 to be output to the set of peripherals 212, or to a subset of the set of peripherals; and
  • machine instructions 524 to generate the response 252 to the received query 238
  • machine instructions 526 to output the response 252 to the surgical computer 150.
  • the conversion process can use a modality conversion table 258.
  • the modality conversion table 258 can be configured with mapping data 260 to contain mappings 262 to 266 between one or more data formats or modalities contained within the response data 254 output by the LLM 248 and one or more data formats or modalities of the data formats or modalities associated with the set, or a subset, of the set of peripherals 212.
  • the mappings 262 to 266 have been shown as comprising three mappings 262 to 266.
  • The surgical computer 150 outputs audio data as the output data 256 for output via the earphones 237. Consequently, the surgeon 130 might hear an audio response.
  • the audio response might be, for example, the following:
  • “Fracture: The insertion of the nail can cause a fracture in the bone. This can happen if the nail is too large or if too much force is used during the insertion process.
  • Nerve or blood vessel damage: The insertion of the nail can damage nerves or blood vessels around the bone, leading to numbness, weakness, or even loss of sensation or function in the affected limb.
  • Non-union or delayed healing: In some cases, the bone may not heal properly or may take longer than expected to heal after the nail is inserted.”
  • the response can also be displayed on the display 230.
  • the surgical computer 150 can be configured to construct the output data 256 in the form of multi-modality output data that outputs the response 252 in various different formats using different peripherals of the set of peripherals 212.
  • the response 252 might comprise the above text detailing points 1 to 5 as the response data 254.
  • The surgical computer 150 can convert the response data 254 into an audio format that is suitable for output via an audio output device such as, for example, the earphones 237 or other type of speaker, and into a visual form such as, for example, text or graphics suitable for display on at least one, or both, of: the display 230 or the augmented reality headset 218, or any other type of display.
  • the mapping of the response data 254 into one or more output modality formats can be reflected by the above described modality conversion table 258.
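A minimal sketch of such a table-driven modality conversion follows. The converter functions and the dictionary layout of the table are illustrative assumptions; the patent does not prescribe a particular data structure for the modality conversion table 258 or the mappings 262 to 266.

```python
def text_to_display(text: str) -> dict:
    """Route text-modality response data to a visual peripheral."""
    return {"peripheral": "display", "payload": text}


def text_to_speech(text: str) -> dict:
    """Route text-modality response data to an audio peripheral.
    A real system would synthesize speech here; this stand-in only
    tags the payload."""
    return {"peripheral": "earphones", "payload": f"<audio:{text}>"}


# Stand-in for the modality conversion table: one response-data modality
# can fan out to several peripheral output formats.
MODALITY_CONVERSION_TABLE = {
    "text": [text_to_display, text_to_speech],
}


def convert(response_modality: str, data: str) -> list:
    """Apply every mapping registered for the response's modality,
    yielding one output item per target peripheral."""
    return [fn(data) for fn in MODALITY_CONVERSION_TABLE.get(response_modality, [])]
```

Registering an additional converter in the table, rather than changing the routing code, is one way to reflect the configurable mapping data 260 described above.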
  • the surgeon 130 may choose to continue the dialogue with a further query, whereupon the query-response cycle repeats. For instance, the surgeon might ask “What can I do to minimise embolism during nail insertion?”

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Primary Health Care (AREA)
  • Surgery (AREA)
  • Epidemiology (AREA)
  • Biomedical Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Animal Behavior & Ethology (AREA)
  • Evolutionary Computation (AREA)
  • Veterinary Medicine (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Molecular Biology (AREA)
  • Robotics (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Urology & Nephrology (AREA)
  • Prostheses (AREA)

Abstract

Examples relate to apparatuses for guiding a surgical procedure. An example apparatus comprises: a communication interface for communicating with an artificial intelligence assistant, the communication interface being arranged i. to receive input data for the artificial intelligence assistant, and ii. to receive output data from the artificial intelligence assistant; and surgical assistance circuitry: i. to generate the input data for the artificial intelligence assistant from data associated with at least one input modality peripheral, and ii. to process the output data from the artificial intelligence assistant to generate output data associated with at least one output modality peripheral.
PCT/EP2025/050452 2024-01-15 2025-01-09 Systems and methods for surgical procedure guidance Pending WO2025153392A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463620907P 2024-01-15 2024-01-15
US63/620,907 2024-01-15

Publications (1)

Publication Number Publication Date
WO2025153392A1 (fr) 2025-07-24

Family

ID=94382452

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2025/050452 Pending WO2025153392A1 (fr) 2024-01-15 2025-01-09 Systems and methods for surgical procedure guidance

Country Status (1)

Country Link
WO (1) WO2025153392A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017220788A1 (fr) * 2016-06-23 2017-12-28 Siemens Healthcare Gmbh System and method for artificial agent based cognitive operating rooms
US20210386489A1 (en) * 2018-10-12 2021-12-16 Sony Corporation Surgical support system, data processing apparatus and method
US20230352133A1 (en) * 2020-06-08 2023-11-02 Activ Surgical, Inc. Systems and methods for processing medical data


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SAI VEMPRALA ET AL: "ChatGPT for Robotics: Design Principles and Model Abilities", arXiv.org, Cornell University Library, Ithaca, NY, 19 July 2023 (2023-07-19), XP091567766 *
SEENIVASAN LALITHKUMAR ET AL: "SurgicalGPT: End-to-End Language-Vision GPT for Visual Question Answering in Surgery", Medical Image Computing and Computer Assisted Intervention (MICCAI 2023), Lecture Notes in Computer Science, Springer Nature Switzerland, Cham, 1 October 2023, pages 281-290, ISBN: 978-3-031-43995-7, ISSN: 0302-9743, XP047671644 *

Similar Documents

Publication Publication Date Title
US12193758B2 (en) Surgical system having assisted navigation
AU2005237479B2 (en) Computer-aided methods for shoulder arthroplasty
CN101711127B (zh) Implant planning using captured joint motion information
US20070073136A1 (en) Bone milling with image guided surgery
US20070016008A1 (en) Selective gesturing input to a surgical navigation system
EP1697874B1 (fr) Computer-assisted knee replacement apparatus
US20050203384A1 (en) Computer assisted system and method for minimal invasive hip, uni knee and total knee replacement
US20050267353A1 (en) Computer-assisted knee replacement apparatus and method
US20070233156A1 (en) Surgical instrument
US20060200025A1 (en) Systems, methods, and apparatus for automatic software flow using instrument detection during computer-aided surgery
US20050159759A1 (en) Systems and methods for performing minimally invasive incisions
JP2005137904A (ja) System and method for registering two-dimensional image data to landmark points digitized during surgery
US11980549B2 (en) Robotic shoulder fracture management
US20230240759A1 (en) Modular and depth-sensing surgical handpiece
WO2025153392A1 (fr) Systems and methods for surgical procedure guidance
WO2024254421A1 (fr) Fiducial marker assemblies for surgical navigation systems
WO2024092178A1 (fr) Patient-matched navigation cutting guide
US20250302538A1 (en) Virtual alignment of patient anatomy
US20240238101A1 (en) Device and method of using the device for assessing range-of-motion during hip resurfacing procedures
WO2025195657A1 (fr) Systems, devices and methods for navigated bone resection
JP2025119597A (ja) Computer-assisted pelvic surgery navigation
Marcacci et al. Computer-Aided Surgery in Orthopaedics
Jaydev P. Desai, Medical Robotics

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 25700846

Country of ref document: EP

Kind code of ref document: A1