
WO2021054659A2 - Surgical navigation device and method - Google Patents

Surgical navigation device and method

Info

Publication number
WO2021054659A2
WO2021054659A2 (application PCT/KR2020/011896)
Authority
WO
WIPO (PCT)
Prior art keywords
marker
surgical
robot
posture
correlation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2020/011896
Other languages
English (en)
Korean (ko)
Other versions
WO2021054659A3 (French)
Inventor
김봉오
임흥순
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Curexo Inc
Original Assignee
Curexo Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Curexo Inc filed Critical Curexo Inc
Publication of WO2021054659A2
Publication of WO2021054659A3
Anticipated expiration legal-status: Critical
Current legal status: Ceased

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods
    • A61B17/00234 Surgical instruments, devices or methods for minimally invasive surgery
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00115 Electrical control of surgical instruments with audible or visual output
    • A61B2017/00119 Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
    • A61B2017/00128 Electrical control of surgical instruments with audible or visual output related to intensity or progress of surgical action
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/107 Visualisation of planned trajectories or target regions
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A61B2034/2068 Surgical navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B2034/2072 Reference field transducer attached to an instrument or patient
    • A61B34/30 Surgical robots
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3904 Markers specially adapted for marking specified tissue
    • A61B2090/3916 Bone tissue
    • A61B2090/3983 Reference marker arrangements for use with image guided surgery
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/02 Prostheses implantable into the body
    • A61F2/30 Joints
    • A61F2/46 Special tools for implanting artificial joints
    • A61F2/4603 Special tools for implanting artificial joints for insertion or extraction of endoprosthetic joints or of accessories thereof
    • A61F2/461 Special tools for implanting artificial joints for insertion or extraction of endoprosthetic joints or of accessories thereof, of knees
    • A61F2002/4632 Special tools for implanting artificial joints using computer-controlled surgery, e.g. robotic surgery
    • A61F2002/4635 Special tools for implanting artificial joints using minimally invasive surgery

Definitions

  • The present invention relates to a surgical navigation device and method, and more particularly to a surgical navigation device and method for detecting whether a marker attached to a surgical object has been deformed.
  • In robotic joint surgery, the surgical path is determined in advance, an optical marker is mounted on the surgical object, the position and posture of the optical marker are tracked with an optical sensor, and the operation proceeds while the position of the robot is monitored.
  • A marker mounted on the surgical object is often deformed during surgery by external force or the like.
  • When the marker is deformed, errors arise in estimating the position and posture of the surgical object.
  • Conventionally, an additional marker or instrument was attached to the surgical object to determine whether the marker was deformed.
  • The present invention has been proposed to solve the above problems, and provides a surgical navigation device and method capable of easily detecting, during surgery, whether a marker attached to a surgical object has been deformed.
  • The present invention further provides a surgical navigation device and method that allow easy recovery when the marker is deformed.
  • To achieve the above object, a surgical navigation device includes: an object matching unit that matches a first image, which includes an object marker attached to the surgical object, with a second image of the surgical object taken before surgery, and derives a correlation regarding the position and posture between the object marker and the surgical object;
  • a reference position storage unit that sets and stores a reference position for at least one reference point of the surgical object;
  • a position calculating unit that receives the position and posture information of the object marker from a tracker and calculates the position of the reference point of the surgical object based on the derived correlation;
  • and a marker deformation determination unit that determines that the object marker is deformed when the position of the reference point calculated by the position calculating unit deviates from the reference position.
  • To achieve the above object, a surgical navigation device according to another aspect includes: an object matching unit that matches a first image, which includes an object marker attached to the surgical object, with a second image of the surgical object taken before surgery, and derives a correlation regarding the position and posture between the object marker and the surgical object;
  • a reference position storage unit that calculates and stores a reference position relationship for the changeable positions of the object marker, using at least one reference point of the surgical object as a reference position;
  • and a marker deformation determination unit that receives the position and posture information of the object marker from a tracker and determines that the object marker is deformed when the position and posture of the object marker deviate from the reference position relationship.
  • In addition, a correlation regarding the position and posture between the robot marker and the surgical object, and a correlation regarding the position and posture between the robot marker and the reference point, are derived.
  • To achieve the above object, the surgical navigation method includes: matching a first image, which includes an object marker attached to the surgical object, with a second image of the surgical object taken before surgery, and deriving a correlation regarding the position and posture between the object marker and the surgical object; setting and storing a reference position for at least one reference point of the surgical object; receiving the position and posture information of the object marker from a tracker and calculating the position of the reference point of the surgical object based on the derived correlation; and determining that the object marker is deformed when the calculated position of the reference point deviates from the reference position.
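  The method steps above (matching, storing the reference position, tracking, and comparison) can be sketched in code. This is a minimal, non-authoritative illustration assuming 4x4 homogeneous transforms: `T_tracker_marker` is the marker pose reported by the tracker, `T_marker_ref` is the fixed marker-to-reference-point transform obtained at registration, and the tolerance value and all names are assumptions, not taken from the patent.

```python
import numpy as np

TOLERANCE_MM = 1.0  # assumed threshold; the patent does not specify a value

def reference_point_position(T_tracker_marker, T_marker_ref):
    """Position of the bone reference point (e.g. the hip joint center)
    in tracker coordinates, given the current marker pose."""
    return (T_tracker_marker @ T_marker_ref)[:3, 3]

def marker_deformed(T_tracker_marker, T_marker_ref, stored_ref_position):
    """True when the tracked reference point deviates from the stored
    reference position, i.e. the marker is judged to be deformed."""
    current = reference_point_position(T_tracker_marker, T_marker_ref)
    return bool(np.linalg.norm(current - stored_ref_position) > TOLERANCE_MM)
```

  Because the reference point (hip or ankle joint center) is physically fixed during surgery, any deviation of the computed point from the stored position can only come from a change in the marker-to-bone relationship.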
  • According to the present invention, it is possible to easily detect in real time whether a marker is deformed, by setting a reference position for a reference point on the surgical object and tracking the position of that reference point during surgery.
  • FIG. 1 schematically shows a surgical navigation system according to an embodiment of the present invention.
  • FIG. 2 is a control block diagram of a surgical navigation device according to an embodiment of the present invention.
  • FIG. 3 is a view explaining the operation of the control unit of the surgical navigation device according to an embodiment of the present invention.
  • FIG. 4 illustrates the operation of the marker deformation determination unit according to an embodiment of the present invention.
  • FIG. 5 is a block diagram of a control unit of a surgical navigation device according to an embodiment of the present invention.
  • FIG. 6 illustrates the operation of the robot matching unit according to an embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating a method of determining whether an object marker is deformed by the surgical navigation device according to an embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a method of determining whether an object marker is deformed by a surgical navigation device according to another embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a method of determining whether an object marker is deformed, and a recovery method, by a surgical navigation device according to another embodiment of the present invention.
  • The surgical navigation system according to an embodiment of the present invention includes object markers 10 and 20 attached to surgical objects 1 and 2, a surgical robot 30, a tracker 40, and a surgical navigation device 100.
  • The surgical objects 1 and 2 are the objects of surgery. In the embodiment of the present invention, the surgical object is the knee joint of the femur 1 (Femur) and the tibia 2 (Tibia), and the object markers 10 and 20 are described, by way of example, as attached to the femur 1 and the tibia 2, respectively.
  • the surgical robot 30 is for joint surgery and includes a robot base and an arm, and a cutting tool may be positioned on an end effector of the arm.
  • a robot marker 31 is attached to the base of the surgical robot 30.
  • Optical markers may be used for the object markers 10 and 20 and the robot marker 31; three or four bars extending in different directions from a center point are formed, and a highly reflective ball marker may be formed at the end of each bar.
  • The tracker 40 tracks the positions and postures of the robot marker 31 attached to the surgical robot 30 and the object markers 10 and 20 attached to the surgical objects 1 and 2; it senses the position and posture of each marker in a three-dimensional coordinate space and transmits them to the surgical navigation device 100 described later.
  • Hereinafter, the tracker 40 will be described, by way of example, as implemented as an optical tracking system (OTS).
  • The surgical navigation device 100 performs registration of the surgical objects 1 and 2 and registration of the surgical robot 30, and receives signals from the tracker 40 to determine whether the object markers 10 and 20 are deformed; as shown in FIG. 1, it may be implemented with a computer or microprocessor and a display. In FIG. 1, the surgical navigation device 100 is shown as a separate device from the surgical robot 30, but in some cases the computer or microprocessor of the surgical navigation device may be installed in the surgical robot 30, and the display of the surgical navigation device 100 may be connected to the tracker 40 and installed with it. In this case, the surgical robot 30 is connected to the tracker 40, receives the position/posture information of the markers, processes it, and provides it to the display.
  • The surgical navigation device 100 includes a signal receiving unit 110, a user input unit 120, a display unit 130, a memory unit 140, and a control unit 150.
  • The signal receiving unit 110 receives signals from the outside, and may include, for example, an HDMI (High Definition Multimedia Interface) connector or a D-sub connector for connection with an external device, or a communication module for connecting to the Internet or other wired/wireless networks. The signal receiving unit 110 may include a wired/wireless communication module for interworking with the tracker 40 and the surgical robot 30.
  • the user input unit 120 is for receiving a command from the user and transmitting it to the control unit 150 to be described later, and may include at least one of various user input means such as a keyboard, a mouse, and a button.
  • The display unit 130 displays an image on a screen and may be implemented as a liquid crystal display (LCD) panel, a light emitting diode (LED) panel, an organic light emitting diode (OLED) panel, or the like.
  • the memory unit 140 may store various OSs, middleware, platforms, and various applications of the surgical navigation device 100, and store program codes, signal-processed image signals, audio signals, and data.
  • The memory unit 140 stores an image of the surgical objects 1 and 2 acquired before the operation, such as a patient CT image.
  • the memory unit 140 may be implemented as a read only memory (ROM), an erasable programmable read-only memory (EPROM), a random access memory (RAM), or the like.
  • the control unit 150 is in charge of overall control of the surgical navigation device 100 by a user command or an internal program input through the user input unit 120.
  • the control unit 150 may be implemented by including a computer program code for signal processing and control and a microprocessor executing the computer program.
  • The control unit 150 performs image matching and position tracking using the position/posture information received from the tracker 40 through the signal receiving unit 110, and detects whether the object markers 10 and 20 are deformed. The control unit 150 also performs recovery when the object markers 10 and 20 are deformed.
  • control unit 150 includes an object matching unit 151, a reference position storage unit 153, and a marker deformation determining unit 155.
  • The object matching unit 151 matches a first image (optical image) including the object markers 10 and 20 obtained from the tracker 40 with a second image (e.g., a 3D CT image) of the patient's surgical objects 1 and 2 taken before surgery, and derives a correlation regarding the position/posture between the bone markers 10 and 20 attached to the surgical objects 1 and 2 and the surgical objects 1 and 2. After the object markers 10 and 20 are attached to the surgical objects 1 and 2, a plurality of points on the surgical objects 1 and 2 are touched with a probe, and the positions/postures of the object markers 10 and 20 are recognized.
  • The object matching unit 151 receives an optical image of the position/posture indicated by the probe through the tracker 40 and matches it with the patient's 3D data, for example a CT image previously stored in the memory unit 140; in this way a correlation regarding the position/posture between the markers 10 and 20 and the surgical objects 1 and 2 can be derived.
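  The patent does not name the matching algorithm; paired-point rigid registration by SVD (the Kabsch method) is one standard way to align probe-touched bone points with the pre-operative CT model, sketched here under that assumption with illustrative names.

```python
import numpy as np

def rigid_register(ct_points, probe_points):
    """Least-squares rotation R and translation t such that
    R @ ct_point + t ~= probe_point, for N corresponding 3-D point pairs."""
    ct = np.asarray(ct_points, dtype=float)
    pr = np.asarray(probe_points, dtype=float)
    c_ct, c_pr = ct.mean(axis=0), pr.mean(axis=0)
    H = (ct - c_ct).T @ (pr - c_pr)          # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_pr - R @ c_ct
    return R, t
```

  Given the recovered (R, t) and the tracked marker pose at registration time, the fixed marker-to-bone correlation the patent describes can then be stored.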
  • the object matching unit 151 may be implemented including a software algorithm for image matching.
  • Herein, the term 'surgical object' is used both in the broad sense of an object to be operated on, such as the femur 1 and the tibia 2, and, by convention, to indicate a specific location or surgical site.
  • the object to be operated is a knee joint of the femur 1 and the tibia 2, and the object markers 10 and 20 are attached to the femur 1 and the tibia 2.
  • FMC: Femur Marker Coordinate system
  • HC: Hip Coordinate system
  • HJC: Hip Joint Center
  • Through image registration, the object registration unit 151 derives the correlation between the position/posture of the femur marker 10 and the position/posture of the hip joint center (HJC) of the femur 1; for example, a transformation matrix FTH can be derived as the coordinate transformation relationship from the coordinate system based on the femur marker 10 to the coordinate system based on the hip joint center (HJC) of the femur 1.
  • The control unit 150 obtains the position and posture information of the femur marker 10 from the tracker 40, and can derive the position and posture of the hip joint center (HJC) of the femur 1 by multiplying the obtained position and posture information by the transformation matrix FTH derived by the object matching unit 151.
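  The matrix multiplication just described can be illustrated with 4x4 homogeneous matrices. This is a non-authoritative sketch; the function names and frame labels are assumptions chosen to mirror the FMC/HJC notation.

```python
import numpy as np

def derive_F_T_H(T_tracker_FMC, T_tracker_HJC):
    """Fixed transform from the femur-marker frame (FMC) to the
    hip-joint-center frame (HJC), computed once at registration.
    It stays constant while the marker is rigidly fixed to the femur."""
    return np.linalg.inv(T_tracker_FMC) @ T_tracker_HJC

def hip_joint_center(T_tracker_FMC_now, F_T_H):
    """Current HJC pose in tracker coordinates: the tracked marker
    pose multiplied by the stored transform F_T_H."""
    return T_tracker_FMC_now @ F_T_H
```

  The same construction applies verbatim to the tibia marker and the ankle joint center with a transform TTA.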
  • TMC: Tibia Marker Coordinate system
  • AJC: Ankle Joint Center
  • Similarly, the transformation matrix TTA is derived as the coordinate transformation relationship from the coordinate system based on the tibia marker 20 to the coordinate system based on the ankle joint center (AJC) of the tibia 2.
  • The control unit 150 obtains the position and posture information of the tibia marker 20 from the tracker 40, and can derive the position and posture of the ankle joint center (AJC) of the tibia 2 by multiplying this information by the transformation matrix TTA derived by the object matching unit 151.
  • The object matching unit 151 may also derive correlations regarding the position/posture of a plurality of points, such as an implant origin. Accordingly, if the positions/postures of the object markers 10 and 20 are tracked by the tracker 40, the positions and postures of a plurality of points of the surgical objects 1 and 2 can be derived.
  • When the registration of the surgical objects 1 and 2 is completed, the surgical robot 30 is moved close to the surgical objects 1 and 2 to prepare for surgery. At this time, the surgical objects 1 and 2 are fixed.
  • The surgical objects 1 and 2 are fixed so as to prevent physical movement of the hip joint center and the ankle joint center.
  • Meanwhile, the femur 1 and tibia 2 themselves may be in a moving or fixed state.
  • The reference position storage unit 153 sets and stores a reference position for at least one reference point of the surgical objects 1 and 2, based on the correlation between the object markers 10 and 20 and the surgical objects 1 and 2 derived by the object matching unit 151.
  • the reference location storage unit 153 may be implemented by a memory or a register.
  • the reference point refers to a point used as a reference for movement in the surgical objects 1 and 2, and may be set differently according to the type of the surgical objects 1 and 2 or the type of surgery.
  • For example, the hip joint center is set as a reference point, and its position and posture after the surgical objects 1 and 2 are fixed are stored as the reference position; likewise, the ankle joint center is used as a reference point.
  • the reference position of the reference point of the surgical object 1 and 2 stored in the reference position storage unit 153 means a position and posture based on the coordinate system of the tracker 40.
  • The position calculating unit 154 receives the position and posture information of the object markers 10 and 20 from the tracker 40 and, based on the correlation derived by the object matching unit 151, calculates the position of the reference point of the surgical objects 1 and 2.
  • the location calculation unit 154 may be implemented by including a software algorithm for location calculation.
  • The position calculation unit 154 calculates the position of the hip joint center by tracking the position and posture of the femur marker 10, and calculates the position of the ankle joint center by tracking the position and posture of the tibia marker 20.
  • the position calculation unit 154 continuously calculates the position of the reference point at a certain period during the operation. At this time, the position calculation unit 154 calculates the position and posture of the reference point based on the coordinate system of the tracker.
  • The marker deformation determination unit 155 determines whether a marker is deformed, and may be implemented including a software algorithm. When the position of the reference point calculated by the position calculation unit 154 deviates from the previously stored reference position, the marker deformation determination unit 155 determines that the object markers 10 and 20 are deformed; when the calculated position is the same as the previously stored reference position, it judges the state as normal. As shown in FIG. 3, the femur 1 is rotatable about the hip joint center as a reference point, and the tibia 2 is rotatable about the ankle joint center as a reference point.
  • For example, the femur marker 10 may be located on the spherical surface A1, and the tibia marker 20 may be located on the spherical surface A2. Since the correlation of position and posture between the object markers 10 and 20 and the object does not change, even if the object markers 10 and 20 move during surgery, the calculated position of the reference point, e.g., the hip joint center or the ankle joint center, should remain the same as the stored reference position.
  • The marker deformation determination unit 155 determines whether the position of the reference point calculated by the position calculation unit 154 is the same as the reference position stored in the reference position storage unit 153; if they are the same, the state is judged normal, and if not, it is determined that the marker has been deformed.
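  Taken together, the position calculation unit and the deformation determination unit could run as a periodic check like the following sketch. The poll count, tolerance, and return strings are assumptions for illustration, not values from the patent.

```python
import numpy as np

def monitor_marker(get_marker_pose, F_T_H, stored_ref_position,
                   tol_mm=1.0, cycles=10):
    """Periodically recompute the reference point from the tracked marker
    pose and compare it with the stored reference position."""
    for _ in range(cycles):
        pose = get_marker_pose()                 # 4x4 pose from the tracker
        ref_now = (pose @ F_T_H)[:3, 3]
        if np.linalg.norm(ref_now - stored_ref_position) > tol_mm:
            return "deformed"   # would trigger the on-screen warning message
    return "normal"
```

  In the device this loop would correspond to the position calculation unit running at a fixed period, with a positive result passed to the GUI generator for display.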
  • The position calculation unit 154 acquires the position and posture information of the femur marker 10 from the tracker 40, and calculates the position of the reference point of the surgical objects 1 and 2, for example the hip joint center, using the coordinate correlation FTH calculated by the object registration unit 151.
  • For example, the femur marker 10 rotates to the right from its origin position in the existing FMC coordinate system, moving to the posture of the FMC' coordinate system; accordingly, the hip joint center moves from the existing HJC position to the HJC' position.
  • the marker deformation determination unit 155 determines whether the HJC position, which is the position of the reference point previously stored in the reference position storage unit 153, and the HJC′ position, which is the position of the currently tracked reference point, are the same. As shown in FIG. 4, since the reference position HJC of the reference point and the current position HJC' are different, the marker deformation determination unit 155 may determine that the marker is deformed due to an external force or the like.
  • Similarly, the position calculation unit 154 acquires the position and posture information of the tibia marker 20 from the tracker 40, and calculates the position of the reference point of the surgical objects 1 and 2, for example the ankle joint center, using the coordinate correlation TTA calculated by the object registration unit 151.
  • the tibia marker 20 moves in parallel downward from the origin of the existing TMC coordinate system and moves to the origin of the TMC' coordinate system, and accordingly, the ankle joint center is moved from the existing AJC position to the AJC' position.
  • the marker deformation determination unit 155 determines whether the AJC, which is the position of the reference point previously stored in the reference position storage unit 153, and the AJC′, which is the position of the currently tracked reference point, are the same. As shown in FIG. 4, since the reference position AJC and the current position AJC' of the reference point are different, the marker deformation determination unit 155 may determine that the marker is deformed due to an external force or the like.
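The deformation check described above can be sketched with homogeneous 4x4 transforms; this is a minimal illustration, and the function names, the 1 mm tolerance, and the numeric poses are assumptions for the example, not part of the disclosure.

```python
import numpy as np

def reference_point_position(marker_pose, marker_T_ref):
    """Position of a reference point (e.g. the hip joint center) in tracker
    coordinates, given the tracked 4x4 pose of an object marker and the
    fixed marker-to-reference-point transform derived at registration."""
    return (marker_pose @ marker_T_ref)[:3, 3]

def marker_deformed(marker_pose, marker_T_ref, stored_ref_pos, tol_mm=1.0):
    """True if the tracked reference point has drifted from the stored
    reference position by more than the tolerance."""
    current = reference_point_position(marker_pose, marker_T_ref)
    return np.linalg.norm(current - stored_ref_pos) > tol_mm
```

With an undisturbed marker the computed reference point matches the stored one; a marker that has rotated on the bone moves the computed hip joint center (HJC to HJC' in FIG. 4) and trips the check.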
  • The surgical navigation device may further include a GUI generator 156.
  • When a marker deformation is detected, the GUI generator 156 generates a message informing the user and transmits it to be displayed on the display unit 130.
  • The GUI generator may include a graphic processing module, for example, a graphic card, which processes data and generates an image. Through the message displayed on the screen, the user can know that the marker has been deformed.
  • FIG. 5 is a block diagram of the control unit 150 of the surgical navigation device according to an embodiment of the present invention.
  • Compared to the above-described embodiment, the surgical navigation device according to the present embodiment may further include a robot matching unit 152b as part of the matching unit 152, and may further include a recovery unit 157.
  • The robot matching unit 152b is configured to derive the correlation of position and posture between the robot marker 31 and the surgical objects 1 and 2, and the correlation of position and posture between the robot marker 31 and the reference point.
  • The robot matching unit may be implemented including a software algorithm for position matching.
  • The robot matching unit 152b performs robot registration by attaching the robot marker 31 to the arm and base of the robot, moving the robot arm while tracking the position and posture of the robot marker 31 with the tracker 40, and deriving the correlation between the position/posture of the robot base and the position/posture of the robot marker 31.
  • The surgical robot 30 is placed in an operable area, and the correlation of position and posture between the robot marker 31 installed on the base of the surgical robot 30, the object markers 10 and 20, and the surgical objects 1 and 2 is derived through the tracker 40.
  • The robot matching unit 152b may derive the correlation of position and posture among the surgical objects 1 and 2, the object markers 10 and 20, and the robot marker 31 based on the coordinate system of the surgical robot 30 or of the robot marker 31.
  • FIG. 6 illustrates the operation of the robot matching unit 152b according to an embodiment of the present invention.
  • RM: Robot Marker
  • RMC: Robot Marker Coordinate
  • The robot matching unit 152b derives the correlations (R T F, R T T) of position and posture between the robot marker 31 and the object markers 10 and 20 based on the position and posture information of the markers acquired from the tracker 40. The robot matching unit 152b may then derive the correlations (R T H, R T A) of position and posture between the robot marker 31 and the surgical objects 1 and 2, based on the correlations (F T H, T T A) of position and posture between the object markers 10 and 20 and the reference points calculated by the object matching unit 151.
  • R T H = R T F x F T H
  • R T A = R T T x T T A
  • Here, R T F is the transformation matrix of the femur marker 10 with respect to the robot marker 31,
  • R T T is the transformation matrix of the tibia marker 20 with respect to the robot marker 31,
  • R T H is the transformation matrix of the hip joint center with respect to the robot marker 31, and
  • R T A is the transformation matrix of the ankle joint center with respect to the robot marker 31.
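The chained relations above can be written directly as matrix products. The sketch below uses numpy homogeneous matrices; the numeric translations are illustrative assumptions only, not values from the disclosure.

```python
import numpy as np

def translation(x, y, z):
    """Homogeneous 4x4 pure-translation transform (illustrative poses)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# R T F: femur marker pose in the robot-marker frame (tracked),
# F T H: hip joint center in the femur-marker frame (from registration).
R_T_F = translation(200.0, 0.0, 50.0)
F_T_H = translation(0.0, 0.0, 100.0)

# R T H = R T F x F T H: hip joint center in the robot-marker frame.
R_T_H = R_T_F @ F_T_H
```

The same composition gives R T A from R T T and T T A on the tibia side.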
  • The correlations (R T H, R T A) of position and posture between the robot marker 31 and the reference positions of the reference points of the surgical objects 1 and 2, calculated by the robot matching unit 152b, are stored in the reference position storage unit 153.
  • The correlations (R T F, R T T) of position and posture between the robot marker 31 and the object markers 10 and 20, calculated by the robot matching unit 152b, are also stored in the reference position storage unit 153.
  • The reference position storage unit 153 stores the reference positions of the reference points of the surgical objects 1 and 2, and may additionally store the position and posture information of the reference points based on the coordinate system of either the robot marker 31 or the base of the surgical robot 30.
  • For recovery, position and posture information of the reference point based on a fixed third position that does not move is required. Since the tracker 40 may be moved during surgery, in the present embodiment the position and posture information of the reference point for recovery is stored based on a coordinate system that does not move, namely the coordinate system of the base of the surgical robot 30 or the coordinate system of the robot marker 31, and is used for recovery. In addition, in the present embodiment, the marker deformation determination unit 155 may calculate the position of the reference point based on the coordinate system of the robot marker 31 or the base of the surgical robot 30 and compare it with the value stored in the reference position storage unit 153 to determine whether the marker has been deformed.
  • Alternatively, the marker deformation determination unit 155 may calculate the position of the reference point based on the coordinate system of the tracker 40 and compare it with a value stored in the reference position storage unit 153 (e.g., a reference position based on the tracker coordinate system) to determine whether the marker has been deformed.
  • The recovery unit 157 derives the correlation of position and posture of the object markers 10 and 20 before and after their deformation, based on the change in the correlation between the object markers 10 and 20 and the robot marker 31 before and after the deformation, and resets the correlation between the object markers 10 and 20 and the surgical objects 1 and 2 based on that correlation.
  • The recovery unit 157 may be implemented including a software algorithm for calculating a position.
  • FIG. 7 illustrates the operation of the recovery unit 157 according to an embodiment of the present invention.
  • It is assumed that the femur marker 10 has rotated on the femur 1 and that the tibia marker 20 has translated vertically on the tibia 2.
  • The marker deformation determination unit 155 determines whether the positions of the reference points calculated by the position calculation unit 154, e.g., the hip joint center position (HJC') and the ankle joint center position (AJC'), are the same as the reference positions HJC and AJC, and determines that deformation has occurred in both the femur marker 10 and the tibia marker 20.
  • The recovery unit 157 resets the correlation of position and posture between the object markers 10 and 20 and the surgical objects 1 and 2 based on the changed markers.
  • The correlation between the femur marker 10 after deformation and the robot marker 31 can be calculated from the position/posture acquired through the tracker 40; in FIG. 7 it is represented by the transformation matrix (R T F') of the femur marker 10 with respect to the robot marker 31 after the deformation.
  • The reference position storage unit 153 stores the transformation matrix of the femur marker 10 with respect to the robot marker 31 before the marker deformation occurred.
  • Using the transformation matrices of the femur marker 10 with respect to the robot marker 31 before and after the deformation of the femur marker 10, the recovery unit 157 can derive the correlation for the change in position and posture of the femur marker 10 before and after the deformation as follows.
  • F T F' = inv(R T F) x R T F' ... (1)
  • Here, F T F' is the transformation matrix for the change in position and posture of the femur marker 10 before and after its deformation, and R T F' is the transformation matrix of the femur marker 10 with respect to the robot marker 31 after the deformation.
  • R T F is the transformation matrix of the femur marker 10 with respect to the robot marker 31 before the deformation of the femur marker 10 occurred,
  • R T H is the transformation matrix of the hip joint center (reference position) with respect to the robot marker 31 before the deformation of the femur marker 10, and
  • F T H is the transformation matrix of the hip joint center (reference position) with respect to the femur marker 10 before the deformation of the femur marker 10.
  • The recovery unit 157 can reset the correlation by deriving the correlation between the femur marker 10 after deformation and the hip joint center (HJC), based on the correlation (F T F') for the change in position and posture of the femur marker 10 before and after the deformation.
  • F' T H = inv(F T F') x F T H ... (2)
  • F' T H denotes the transformation matrix of the hip joint center (HJC) with respect to the femur marker 10 after the deformation of the femur marker 10.
  • Using the transformation matrices of the tibia marker 20 with respect to the robot marker 31 before and after the deformation of the tibia marker 20, the recovery unit 157 can derive the correlation for the change in position and posture of the tibia marker 20 before and after the deformation as follows.
  • T T T' = inv(R T T) x R T T' ... (3)
  • R T T' is the transformation matrix of the tibia marker 20 with respect to the robot marker 31 after the deformation of the tibia marker 20,
  • R T T is the transformation matrix of the tibia marker 20 with respect to the robot marker 31 before the deformation of the tibia marker 20 occurred,
  • R T A is the transformation matrix of the ankle joint center (reference position) with respect to the robot marker 31 before the deformation of the tibia marker 20, and
  • T T A is the transformation matrix of the ankle joint center (reference position) with respect to the tibia marker 20 before the deformation of the tibia marker 20.
  • The recovery unit 157 can reset the correlation by deriving the correlation between the tibia marker 20 after deformation and the ankle joint center (AJC), based on the correlation (T T T') for the change in position and posture of the tibia marker 20 before and after the deformation.
  • T' T A = inv(T T T') x T T A ... (4)
  • T' T A denotes the transformation matrix of the ankle joint center (AJC) with respect to the tibia marker 20 after the deformation of the tibia marker 20.
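Both recovery chains, for the femur and for the tibia, reduce to the same two matrix operations, sketched below with numpy; the function name and the example poses are assumptions for illustration, not values from the disclosure.

```python
import numpy as np

def recover_marker_to_reference(R_T_F, R_T_Fp, F_T_H):
    """Re-derive the marker-to-reference-point transform after a marker
    deformation. R_T_F / R_T_Fp: marker pose in the robot-marker frame
    before / after the deformation; F_T_H: marker-to-reference transform
    from registration. Returns the reference point expressed in the
    deformed marker's frame (F' T H in the text)."""
    F_T_Fp = np.linalg.inv(R_T_F) @ R_T_Fp   # change of marker pose
    return np.linalg.inv(F_T_Fp) @ F_T_H     # reset the correlation
```

Because the reference point is fixed relative to the robot marker, composing the deformed marker pose with the recovered transform reproduces the original reference position.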
  • The correlations between the surgical objects 1 and 2 and the object markers 10 and 20 reset by the recovery unit 157 are stored in the object matching unit 152a and the robot matching unit 152b, and the position calculation unit 154 may track the position and posture of the surgical objects 1 and 2 based on the newly set correlations between the object markers 10 and 20 and the surgical objects 1 and 2.
  • Since the robot marker 31 may itself be deformed by an external force or the like, it is preferable to separately provide a restoration marker for verifying the robot marker 31.
  • The marker deformation determination unit 155 may also determine whether a marker has been deformed by acquiring the position and posture information of the object markers 10 and 20 from the tracker 40 and determining whether it satisfies the reference positional relationship stored in the reference position storage unit 153.
  • The object markers 10 and 20 may be positioned only on the spherical surfaces A1 and A2 defined around the reference positions of the reference points; for example, the spherical surfaces A1 and A2 may serve as the reference positional relationship.
  • In the previous embodiment, the position and posture of the reference point are calculated from the position and posture information of the object markers 10 and 20, whereas the present embodiment differs in that it checks whether the position and posture information of the object markers 10 and 20 satisfies the reference positional relationship.
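Because the marker-to-reference distance is fixed at registration, the marker origin must stay on a sphere of that radius about the reference position. A minimal position-only check (posture is ignored here) might look like the following; the tolerance is an assumed value, not one from the disclosure.

```python
import numpy as np

def on_reference_sphere(marker_pos, ref_pos, radius_mm, tol_mm=1.0):
    """True if the tracked marker origin still lies on the sphere of the
    registered marker-to-reference distance about the reference position
    (the surfaces A1/A2 in the text)."""
    dist = np.linalg.norm(np.asarray(marker_pos, dtype=float)
                          - np.asarray(ref_pos, dtype=float))
    return abs(dist - radius_mm) <= tol_mm
```

A marker that has slipped along the bone axis leaves the sphere and fails the check, even before any reference-point position is computed.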
  • Registration is first performed by recognizing the positions of the object markers 10 and 20 and the surgical objects 1 and 2 using a probe: the first image of the markers acquired by the tracker 40 and the second image of the surgical objects 1 and 2 acquired before the surgery are matched (S10).
  • Through S10, a correlation between the object markers 10 and 20 and the surgical objects 1 and 2 is derived (S11).
  • A reference position is set for a reference point, which serves as the reference for the movement of the object, among the plurality of points of the surgical objects 1 and 2, and the corresponding position value is stored (S12).
  • the surgical objects 1 and 2 include the femur 1 and the tibia 2, and the reference point includes the hip joint center and the ankle joint center.
  • the reference position of the reference point is stored as a position value based on the coordinate system of the tracker 40.
  • The position and posture of the object markers 10 and 20 are acquired through the tracker 40 during surgery (S13), and the position and posture of the reference point are derived using the correlation of position and posture between the object markers 10 and 20 and the reference points of the surgical objects 1 and 2 calculated in the above-described registration process (S14).
  • the correlation between the object markers 10 and 20 attached to the surgical objects 1 and 2 and the reference point is not changed by the movement of the tracker 40 or the movement of the surgical objects 1 and 2.
  • The marker deformation determination unit 155 determines whether the position of the reference point is the same as the reference position stored in the reference position storage unit 153 (S15). If they do not match, it is determined that the object markers 10 and 20 have been deformed (S16); if they match, the object markers 10 and 20 are determined to be in a normal state (S17).
  • FIG. 9 is a flowchart illustrating a method of determining whether the object markers 10 and 20 are deformed by the surgical navigation device according to another embodiment of the present invention. Descriptions redundant with the above-described embodiment are omitted.
  • Registration is first performed through a process of recognizing the positions of the object markers 10 and 20 and the surgical objects 1 and 2 using a probe: the first image of the markers acquired by the tracker 40 and the second image of the surgical objects 1 and 2 acquired before the surgery are matched (S20).
  • Through S20, a correlation between the object markers 10 and 20 and the surgical objects 1 and 2 is derived (S21).
  • The reference positional relationship for the changeable positions of the object markers 10 and 20 is calculated based on the position of the reference point, which serves as the reference for the movement of the object, among the plurality of points of the surgical objects 1 and 2, and is stored in the reference position storage unit 153 (S22).
  • the spherical surfaces of A1 and A2 may be the reference positional relationship.
  • The position and posture of the object markers 10 and 20 are acquired through the tracker 40 during surgery (S23), and it is determined whether they satisfy the reference positional relationship stored in the reference position storage unit 153 (S24). If the position and posture of the object markers 10 and 20 deviate from the reference positional relationship, it is determined that the object markers 10 and 20 have been deformed (S25); if the reference positional relationship is satisfied, the object markers 10 and 20 are determined to be in a normal state (S26).
  • FIG. 10 is a flowchart illustrating a method of determining whether object markers 10 and 20 are deformed and recovering by a surgical navigation device according to another embodiment of the present invention.
  • Registration is performed through a process of recognizing the positions of the object markers 10 and 20 and the surgical objects 1 and 2 using a probe: the first image of the markers acquired by the tracker 40 and the second image of the surgical objects 1 and 2 acquired before the surgery are matched (S30).
  • the correlation of the position and posture between the robot and the robot marker 31 is derived through the registration of the robot.
  • the robot is moved to the operable area and placed, and the surgical objects 1 and 2 are fixed.
  • The correlations of position and posture between the robot marker 31 and the object markers 10 and 20, and between the robot marker 31 and the surgical objects 1 and 2, are derived through the tracker 40 (S31).
  • A reference position is set for a reference point, which serves as the reference for the movement of the object, among the plurality of points of the surgical objects 1 and 2, and the corresponding position value is stored (S32).
  • the reference position with respect to the reference point includes position and posture information based on at least one of the coordinate system of the tracker 40, the coordinate system of the robot marker 31, and the coordinate system of the base of the surgical robot 30.
  • The reference position of the reference point based on the coordinate system of the robot marker 31 or of the surgical robot 30 may be used for recovery, and the determination of whether an object marker has been deformed may be based on any one of the coordinate system of the tracker 40, the coordinate system of the robot marker 31, and the coordinate system of the surgical robot 30.
  • The correlations (R T H, R T A) of position and posture between the robot marker 31 and the reference positions of the reference points, and the correlations (R T F, R T T) between the robot marker 31 and the object markers 10 and 20, calculated by the robot matching unit 152b, are stored in the reference position storage unit 153 and used at the time of recovery.
  • The position and posture of the object markers 10 and 20 are acquired through the tracker 40 (S33), and the position and posture of the reference point are calculated using the correlation of position and posture between the object markers 10 and 20 and the reference points of the surgical objects 1 and 2 calculated in the above-described process (S34).
  • the marker deformation determination unit 155 determines whether the position of the reference point is the same as the reference position stored in the reference position storage unit 153 (S35). If there is a discrepancy, it is determined that the object markers 10 and 20 are deformed (S36), and recovery is performed by the recovery unit 157. Meanwhile, when the position of the reference point is the same as the reference position, it is determined that the object markers 10 and 20 are in a normal state (S39).
  • The recovery unit 157 uses the robot marker 31 to reset the correlation of position and posture between the object markers 10 and 20 and the surgical objects 1 and 2 based on the changed object markers 10 and 20. The recovery unit 157 derives the correlation for the change in position and posture of the object markers 10 and 20 based on the change in the correlation between the object markers 10 and 20 and the robot marker 31 before and after the deformation (S37), and based on this, resets the correlation by deriving the correlation between the object markers 10 and 20 after deformation and the surgical objects 1 and 2, for example, the reference points of the surgical objects 1 and 2 (S38).
  • The correlations between the surgical objects 1 and 2 and the object markers 10 and 20 reset by the recovery unit 157 are stored in the object matching unit 152a and the robot matching unit 152b, and the position calculation unit 154 may track the position and posture of the surgical objects 1 and 2 based on the newly set correlations.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Robotics (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Transplantation (AREA)
  • Pathology (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Vascular Medicine (AREA)
  • Cardiology (AREA)
  • Gynecology & Obstetrics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Manipulator (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a surgical navigation device and method. A surgical navigation device according to the present invention comprises: an object matching unit for matching a first image including an object marker attached to a surgical object with a second image of the surgical object acquired before surgery, thereby deriving a correlation of positions and postures between the object marker attached to the surgical object and the surgical object; a reference position storage unit for setting and storing a reference position for at least one reference point of the surgical object on the basis of the correlation between the object marker and the surgical object derived by the object matching unit; a position calculation unit for receiving information on the position and posture of the object marker and calculating the position of the reference point of the surgical object on the basis of the derived correlation; and a marker deformation determination unit for determining that the object marker has been deformed when the position of the reference point calculated by the position calculation unit deviates from the reference position. It is therefore possible to easily detect in real time whether the object marker has been deformed.
PCT/KR2020/011896 2019-09-18 2020-09-03 Dispositif et procédé de navigation chirurgicale Ceased WO2021054659A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190114451A KR102274175B1 (ko) 2019-09-18 2019-09-18 수술 내비게이션 장치 및 그 방법
KR10-2019-0114451 2019-09-18

Publications (2)

Publication Number Publication Date
WO2021054659A2 true WO2021054659A2 (fr) 2021-03-25
WO2021054659A3 WO2021054659A3 (fr) 2021-05-14

Family

ID=74884632

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/011896 Ceased WO2021054659A2 (fr) 2019-09-18 2020-09-03 Dispositif et procédé de navigation chirurgicale

Country Status (2)

Country Link
KR (1) KR102274175B1 (fr)
WO (1) WO2021054659A2 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115607286A (zh) * 2022-12-20 2023-01-17 北京维卓致远医疗科技发展有限责任公司 基于双目标定的膝关节置换手术导航方法、系统及设备
CN117243699A (zh) * 2023-11-14 2023-12-19 杭州三坛医疗科技有限公司 一种移位检测方法及装置

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10145587B4 (de) * 2001-09-15 2007-04-12 Aesculap Ag & Co. Kg Verfahren und Vorrichtung zur Prüfung eines Markierungselementes auf Verrückung
KR101195994B1 (ko) * 2011-02-11 2012-10-30 전남대학교산학협력단 3차원 광학 측정기를 이용한 뼈 움직임 감지 및 경로 보정 시스템
WO2013189520A1 (fr) * 2012-06-19 2013-12-27 Brainlab Ag Procédé et appareil pour la détection d'une rotation indésirable de marqueurs médicaux
US9541630B2 (en) 2013-02-15 2017-01-10 Qualcomm Incorporated Method and apparatus for determining a change in position of a location marker
KR102296451B1 (ko) * 2014-12-08 2021-09-06 큐렉소 주식회사 중재시술 로봇용 공간정합 시스템
KR101650821B1 (ko) * 2014-12-19 2016-08-24 주식회사 고영테크놀러지 옵티컬 트래킹 시스템 및 옵티컬 트래킹 시스템의 트래킹 방법

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115607286A (zh) * 2022-12-20 2023-01-17 北京维卓致远医疗科技发展有限责任公司 基于双目标定的膝关节置换手术导航方法、系统及设备
CN117243699A (zh) * 2023-11-14 2023-12-19 杭州三坛医疗科技有限公司 一种移位检测方法及装置
CN117243699B (zh) * 2023-11-14 2024-03-15 杭州三坛医疗科技有限公司 一种移位检测方法及装置

Also Published As

Publication number Publication date
KR102274175B1 (ko) 2021-07-12
KR20210033563A (ko) 2021-03-29
WO2021054659A3 (fr) 2021-05-14

Similar Documents

Publication Publication Date Title
US11717351B2 (en) Navigation surgical system, registration method thereof and electronic device
WO2021045546A2 (fr) Dispositif de guidage de position de robot, procédé associé et système le comprenant
WO2021206372A1 (fr) Appareil et procédé de planification de chirurgie de la colonne vertébrale basés sur une image médicale bidimensionnelle
US5249581A (en) Precision bone alignment
WO2018080086A2 (fr) Système de navigation chirurgicale
US11259875B2 (en) Proximity-triggered computer-assisted surgery system and method
US20090306499A1 (en) Self-detecting kinematic clamp assembly
WO2020101283A1 (fr) Dispositif d'assistance chirurgicale mettant en œuvre la réalité augmentée
US20220110700A1 (en) Femoral medial condyle spherical center tracking
US20240341862A1 (en) Operating room remote monitoring
WO2021153973A1 (fr) Dispositif de fourniture d'informations de chirurgie robotique de remplacement d'articulation et son procédé de fourniture
WO2016195401A1 (fr) Système de lunettes 3d pour opération chirurgicale au moyen de la réalité augmentée
WO2014077613A1 (fr) Robot pour procédure de repositionnement, et procédé pour commander son fonctionnement
JP2022016415A (ja) 誘導型整形外科手術のための器具
WO2016154557A1 (fr) Procédés et systèmes pour chirurgie assistée par ordinateur au moyen d'une vidéo intra-opératoire acquise par une caméra à mouvement libre
WO2012171555A1 (fr) Procédé et dispositif pour déterminer l'axe mécanique d'un os
ES2974667T3 (es) Sistema de identificación de posición de marcador para cirugía ortopédica
WO2021054659A2 (fr) Dispositif et procédé de navigation chirurgicale
US20160022173A1 (en) Method and apparatus for determining a leg length difference and a leg offset
CN110464457A (zh) 手术植入规划计算机和由其执行的方法,以及手术系统
WO2022015877A1 (fr) Analyse dynamique d'articulation pour remplacement d'articulation
WO2012108578A1 (fr) Système de surveillance du mouvement des os et de correction d'une trajectoire à l'aide d'une unité de mesure optique en trois dimensions
WO2021162287A1 (fr) Procédé de vérification de la mise en correspondance d'une cible de chirurgie, appareil associé, et système le comprenant
WO2013105738A1 (fr) Appareil de commande de robot chirurgical et son procédé de commande
CN114795376A (zh) 一种关节置换辅助截骨系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20865697

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20865697

Country of ref document: EP

Kind code of ref document: A2