WO2024170384A1 - Medical robot with different end effectors, robot system and control method for a medical robot
- Publication number: WO2024170384A1 (PCT/EP2024/053072)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- instrument
- unit
- robot
- visualization
- end effector
- Prior art date
- Legal status
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/20—Surgical microscopes characterised by non-optical aspects
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00207—Electrical control of surgical instruments with hand gesture control or hand gesture recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00216—Electrical control of surgical instruments with eye tracking or head position tracking control
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00477—Coupling
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2059—Mechanical position encoders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B34/32—Surgical robots operating autonomously
Definitions
- the present disclosure relates to a medical, in particular surgical, collaborative robot for actuating an end effector.
- the disclosure relates to a medical robot system, a computer-implemented control method for a medical robot, a computer-readable storage medium and a computer program according to the preambles of the independent claims.
- Robot hand guiding surgical instruments
- robot eye visualization
- surgeons need both better instrument guidance and better visualization guidance during a procedure, i.e. an improvement in both modalities. This need is increasing in line with the trend towards smaller incisions, which limit the dexterity of a surgeon's hand and the surgeon's vision.
- One sub-task can be seen in using an end effector in a more space-saving, time-saving and yet more precise way.
- one sub-task is to be able to flexibly use different end effectors for an examination or an intervention intraoperatively, i.e. during the intervention, and to switch between them. It should also be possible to manufacture and maintain them cost-effectively.
- the object of the present disclosure is solved according to the invention with respect to a medical robot by the features of claim 1, with respect to a robot system by the features of claim 10, with respect to a computer-implemented control method for a robot by the features of claim 11, with respect to a computer-readable storage medium by the features of claim 12 and with respect to a computer program by the features of claim 13.
- a basic idea of the present disclosure therefore provides for only a single robot having a single robot base (in particular, a mobile medical cart forms the robot base, which can be moved freely in the operating room) to provide two different modalities in an optimized installation space: on the one hand a visualization modality using the visualization unit/visualization device (for visual capture using a recording), and on the other hand an instrument/intervention/manipulator modality using the instrument unit with the instrument.
- a medical robot with an (automatic) positioning and alignment of a visualization unit/imaging unit (as a first end effector) and an instrumentation unit (as a second end effector) is disclosed.
- the medical robot therefore has two end effectors, one for visualization or imaging and one for instrumentation for a corresponding manipulation of, for example, a patient's tissue.
- the end effectors can be mounted or (fixedly) supported or provided on a single robot arm or on two separate robot arms (one robot arm with visualization, one robot arm with instrument) but with a uniform robot base.
- the system preferably further comprises a tracking unit or sensor unit to localize the exact position and/or orientation, in particular the location, of the two end effectors in relation to the (registered) patient (in particular a position of an instrument tip) and to use various control mechanisms to move the robot to actuate one or the other end effector.
- a medical, in particular surgical, collaborative robot for actuating an end effector during an examination or an intervention on a patient, comprising: a robot base as a local connection point of the robot, a movable and actuatable robot arm connected to the robot base with at least one robot arm segment, at least two end effectors connected to the one robot arm, wherein a first end effector is an (analog or digital) visualization unit (Z-device) with a visualization axis, and a second end effector is a medical instrument unit with a medical instrument and an associated instrument axis, in particular with a surgical instrument, and a control unit which is adapted to control the spatial position of the visualization unit or the spatial position of the instrument for an examination or an intervention by means of the actuatable robot arm.
- a position of an instrument tip of the instrument can be adjusted via the position of the instrument unit with the instrument.
- an instrument tip is not in the field of view of the (optical) visualization unit and the optical visualization axis and the instrument axis are separate from each other.
- a single robot arm is therefore used to carry and guide the visualization unit and the instrument unit and thus serves the two modalities of visualization and instrument guidance.
- the robot has only one robot arm for the instrument, whereby the instrument unit for guiding the instrument is provided on the same robot arm as the visualization system, in particular is permanently mounted.
- the visualization unit and the instrument unit can be used separately (i.e. not simultaneously) during a surgical procedure. Two different modalities are provided with just one robotic arm.
- position means a geometric position in three-dimensional space, which is specified in particular by means of coordinates of a Cartesian coordinate system.
- the position can be specified by the three coordinates X, Y and Z.
- orientation in turn indicates an alignment, i.e. a direction or rotation, in three-dimensional space.
- orientation can be indicated using three angles.
- location includes both a position and an orientation.
- the location can be specified using six coordinates, three position coordinates X, Y and Z and three angular coordinates for the orientation.
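The position/orientation/location convention above can be sketched as a simple data structure. This is a minimal illustration only; the field names, units and example values are assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A 'location': three position coordinates and three orientation angles."""
    x: float      # position coordinates, e.g. in metres
    y: float
    z: float
    roll: float   # orientation angles, e.g. in radians
    pitch: float
    yaw: float

# Illustrative end-effector location: six coordinates in total
tool_pose = Pose(x=0.10, y=-0.25, z=0.40, roll=0.0, pitch=1.57, yaw=0.0)
```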
- an instrument unit adapter can be arranged on the robot arm and/or the visualization unit, to which the medical instrument unit can be coupled and decoupled/uncoupled by means of a predefined, complementary counter adapter, in particular without tools, so that the instrument unit can be removed with the instrument and in particular exchanged for another, different instrument.
- the instrument unit for instrument guidance can thus be mounted and dismounted intraoperatively, i.e. during the operation, relative to the visualization system, via the adapted and specially suitable adapter interface.
- no tools need to be used (tool-free exchange), so that a simple manual operation is sufficient for a change during the operation. This saves time and simplifies the exchange process.
- another visualization unit in particular with an endoscope, can be mounted on the instrument unit adapter with a corresponding counter adapter.
- different "units" or "modules" can be exchanged via the "standardized" instrument unit adapter.
- a surgical microscope can be used as the (first main) visualization unit, an instrument can be initially coupled and used using the instrument unit and, during the course of the procedure, the instrument unit can be uncoupled and instead a further, second visualization unit in the form of an endoscope can be coupled in order to be able to use both the modality of the surgical microscope and that of the endoscope, in particular separately at different times, for example alternately.
- a special visualization unit adapter can be arranged on the robot arm and/or the (first, main) visualization unit and/or on the instrument unit. If the (first) visualization unit is designed to be adaptable, it can be coupled to and uncoupled from the robot arm or the instrument unit by means of a predefined, complementary counter adapter, in particular without tools; likewise, a further (second) medical visualization unit can be coupled in this way, so that the visualization unit can be removed and in particular exchanged for a different visualization unit.
- the (first) visualization unit can be a surgical microscope and the (second) visualization unit coupled via the visualization unit adapter can be an endoscope, so that in addition to the modality of a surgical microscope, the collaborative robot can also control an endoscope and use it during an operation.
- the visualization unit can have a rigid spatial relation (rigid fixation) with respect to the medical instrument unit
- the control unit can be adapted to determine the position of the instrument unit from the position of the visualization unit via a predefined static transformation, or conversely to determine the position of the visualization unit from the position of the instrument unit, in order to control the position of the other end effector based on the recorded position of one end effector.
- the visualization unit and the instrument unit (for instrument guidance) have a fixed spatial position or relation relative to one another.
- there is only a single (localization) sensor/tracker that spatially tracks either the visualization system or the instrument guidance. The position of the other unit can then be determined via the rigid relation.
- exactly one tracker or navigation sensor/localization sensor can be provided on the at least two end effectors, in particular either a tracker, in particular a rigid body with optical markers, is attached to the visualization unit or a tracker is attached to the instrument unit, and the control device is adapted to determine the position, in particular location, of both the instrument (with instrument tip) and the visualization unit by tracking the single tracker and the predefined rigid spatial relationship of the visualization unit to the instrument unit. With just a single sensor/tracker, the position, in particular location, of both the visualization unit and the instrument guide can be tracked.
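The single-tracker idea can be illustrated with homogeneous 4x4 transforms: track one end effector and derive the pose of the other through the fixed, pre-calibrated relation between the two units. A minimal sketch using numpy, with invented example values:

```python
import numpy as np

def other_effector_pose(T_world_tracked: np.ndarray,
                        T_tracked_other: np.ndarray) -> np.ndarray:
    """Pose of the untracked end effector: the tracked pose composed
    with the static, pre-calibrated transform between the two units."""
    return T_world_tracked @ T_tracked_other

# Example: tracker on the visualization unit (identity rotation for brevity)
T_world_vis = np.eye(4)
T_world_vis[:3, 3] = [0.5, 0.0, 1.2]

# Static transform from visualization unit to instrument tip (invented values)
T_vis_instr = np.eye(4)
T_vis_instr[:3, 3] = [0.0, 0.15, -0.30]

# Instrument-tip pose follows from the single tracked pose
T_world_instr = other_effector_pose(T_world_vis, T_vis_instr)
```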
- the position and/or orientation of a selected end effector of the at least two end effectors can be determined via a robot kinematics of the robot.
- the robot with its at least one robot arm therefore has an internal robot tracking system (without an external camera being required for tracking) that is designed based on kinematics.
- the robot can have sensors on the robot arm segments that detect the position of the robot arm segments relative to one another; the control unit is adapted to thereby determine the position and/or orientation of the first and the second end effector in relation to the robot base (for example via a static transformation of the end effectors relative to one another and to the robot head).
- the control unit can be adapted to determine the position and/or orientation via active control of the robot arm (for example with stepper motors) together with a known transformation from the robot head to the selected end effector.
- if the robot base is not in a static spatial position in relation to the patient, i.e. is not firmly connected to the patient, it has a tracker which can be detected by a tracking system, in particular an external navigation camera.
- the patient can be registered with the robot via the optically tracked robot base and, together with the robot kinematics, the position and/or orientation of the selected end effector with respect to the patient can be controlled.
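The kinematics-based internal tracking described above can be sketched, for a simplified planar arm, as accumulating encoder-reported joint angles along the chain of arm segments; a real 6-DOF arm would use full 3D transforms per segment, but the principle is the same. Segment lengths and angles here are illustrative:

```python
import math

def forward_kinematics(joint_angles, link_lengths):
    """Planar forward kinematics: accumulate the encoder-reported joint
    angles along the chain of arm segments to obtain the end-effector
    position (x, y) and heading theta relative to the robot base."""
    x = y = theta = 0.0
    for q, length in zip(joint_angles, link_lengths):
        theta += q
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y, theta

# Two 0.4 m arm segments, both joints at 90 degrees
x, y, theta = forward_kinematics([math.pi / 2, math.pi / 2], [0.4, 0.4])
```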
- the instrument axis of the instrument can be aligned such that it does not intersect the visualization axis, in particular the instrument axis and the visualization axis move away from each other starting from the two end effectors, so that the instrument does not impair the visualization of the visualization unit, in particular is not visible in the field of view of the visualization unit.
- the visualization axis of the visualization system and the instrument axis of the instrument guide preferably do not intersect, so that no instrument tip is visible in the field of view of the visualization system and the visualization system (as a modality) can be used to its full extent without disadvantages.
- the visualization axis can have a distance of at least 10 cm, preferably of at least 30 cm, particularly preferably of at least 50 cm from the instrument axis, starting from a front side of the visualization unit at a distance along the visualization axis of (at least 10 cm and/or) a maximum of 60 cm. This excludes any negative influence on the visualization even with longer instruments.
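Whether the visualization axis and the instrument axis keep such a clearance can be checked as the minimum distance between two lines in space. A sketch with numpy and invented example geometry:

```python
import numpy as np

def axis_distance(p1, d1, p2, d2):
    """Minimum distance between two lines, each given by a point p and a
    direction d. Handles both skew and parallel axes."""
    p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
    n = np.cross(d1, d2)
    if np.linalg.norm(n) < 1e-9:  # parallel axes: project the offset
        return np.linalg.norm(np.cross(p2 - p1, d1)) / np.linalg.norm(d1)
    return abs(np.dot(p2 - p1, n)) / np.linalg.norm(n)

# Visualization axis along Z; instrument axis parallel, offset 12 cm in X
d = axis_distance([0, 0, 0], [0, 0, 1], [0.12, 0, 0], [0, 0, 1])
# d is 0.12 m, i.e. above the 10 cm minimum clearance
```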
- the visualization unit can be a surgical microscope, in particular an optical (surgical) microscope or a digital (surgical) microscope, or an endoscope or an ultrasound probe, which generates 2D recordings/images or 3D recordings/images, and/or the instrument unit can be a drilling or guide sleeve for trajectory guidance or a resection instrument, in particular a pair of forceps or a suction tube or a scalpel, or a cutting block, in particular a knee cutting block.
- a second, different visualization unit can also be attached to the robot arm, for example a microscope and an ultrasound probe.
- the visualization unit can therefore preferably be a microscope, an endoscope or an ultrasound probe.
- the visualization unit can generate two-dimensional (2D) images or three-dimensional (3D) images.
- the visualization unit can in particular be an optical microscope or a digital microscope.
- the instrument unit in turn can therefore be equipped with trajectory guides, such as a drill or guide sleeve, or with a resection instrument, such as a pair of forceps, a suction tube or a scalpel, or with a cutting block, such as a knee cutting block.
- 3D defines that the data is spatial, i.e. three-dimensional.
- the patient's body or at least a part of the body with spatial extension can be available digitally as recording data in a three-dimensional space with, for example, a Cartesian coordinate system (X, Y, Z).
- the medical robot can further comprise a navigation system with at least one display device, in particular an operating room monitor, and the control unit can control the robot with the end effectors on the basis of the navigation system, in particular approach waypoints for a selected end effector of the at least two end effectors and adjust a position of the selected end effector accordingly, preferably on the basis of a preoperatively defined operation plan.
- the navigation system enables the robot to automatically approach waypoints and, for example, automatically provide a visualization of a tissue or to place the instrument in the correct position and, for example, also carry out an automated operation on a tissue. In this way, an operation can be carried out and the result of the operation can be shown to the surgeon using the visualization unit.
- the control unit may be specifically adapted to: either navigate and control the instrument unit with the instrument and its instrument axis for navigation as the selected end effector for the procedure; or navigate and control the visualization unit with its visualization axis for navigation as the selected end effector for visualization.
- the medical robot thus offers a user the option of selecting either the visualization unit or the instrument unit to determine the corresponding axis of interest (either instrument axis or the visualization axis) that is tracked and moved by the robot (via the control unit and a tracking system, in particular navigation system).
- the robot arm and thus the end effector can be controlled manually by using a force sensor which is in particular attached to an end effector, preferably integrated in a handle attached to the corresponding end effector.
- the surgeon can select and control the corresponding end effector by operating the respective associated force sensor.
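Force-sensor hand guiding of this kind is commonly realized as admittance control: the force the surgeon applies to the handle is mapped to a velocity command for the arm. A minimal sketch, not the patented method; gain and deadband values are illustrative assumptions:

```python
def hand_guiding_velocity(force, gain=0.002, deadband=2.0):
    """Admittance-style hand guiding: map the force measured at the
    handle (newtons, per Cartesian axis) to a velocity command.
    Forces below the deadband are ignored to reject sensor noise."""
    return [gain * f if abs(f) > deadband else 0.0 for f in force]

# A 10 N push along X with small noise on Y and Z:
# the robot arm follows the surgeon's hand only along X
v = hand_guiding_velocity([10.0, 0.5, -1.0])
```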
- the robot arm with the selected end effector can also perform an automatic movement derived from a preoperative (image-based) plan (for example based on CT images).
- the robot arm with the end effector can also be controlled by remote control device(s) such as voice control and/or an external joystick and/or gesture control and/or head control and/or eye control.
- the preoperative image-based plan may be a predefined trajectory to move the instrument unit or a predefined waypoint to move the visualization unit into.
- the robot arm for visualization by means of the visualization unit may be controlled by the control unit to follow the surgical tool, while the surgical tool is guided either by machine vision from the visualization unit itself or by an external tracking system such as a navigation system.
- the medical robot has only one robot arm with two end effectors, one for visualization and one for instrumentation.
- the two end effectors are in particular firmly connected to one another, so that only one localization system (in particular navigation tracker) is required.
- the two end effectors can alternatively be coupled via a manual joint, in which case two separate localization systems (in particular navigation trackers) are required.
- the joint can have between one and six degrees of freedom.
- the joint can be used, for example, for the rough alignment of the instrument guide, which means that the robot arm has to be moved less and is needed less for movements over long distances. After the rough alignment, the joint can be locked or fixed (a spatial relationship between the instrument unit and the visualization unit becomes statically fixed). Since the instrument guide (i.e. the instrument unit) is localized separately, the robot arm can be used for the fine adjustment of the instrument guide to enable precise alignment of the instrument.
- the medical robot may also have a robotic arm with interchangeable end effectors for the visualization unit and/or the instrument unit.
- the medical robot system can comprise a robot according to one of the preceding claims, wherein an instrument unit adapter is arranged on the robot arm and/or the visualization unit and the robot system further comprises at least two different instrument units with different instruments with (but) an identically designed counter adapter in order to equip the robot with different instruments as required via a uniform interface.
- a set of instrument units can be provided which can even be changed intraoperatively. If the surgeon requires a scalpel as the first instrument in a first step and a surgical drilling tool or resection tool later in the procedure, the instrument unit with the scalpel can be removed and the new instrument unit with the drilling tool or resection tool can be attached (detachably) to the robot arm.
- a tool-free attachment is provided so that the change can be carried out quickly, safely and efficiently.
- Another advantage here is that only the instrument unit adapter has to be sterile, while the rest of the robot can be covered in a sterile manner. The only important thing is the interface with the instrument unit adapter.
- the medical robot system can also have a further visualization unit with the same designed counter adapter, which can be coupled to and uncoupled from the instrument unit adapter in order to provide a further visualization modality.
- the further connectable visualization unit can have an endoscope in order to provide an endoscope function during an intervention. In this way, a set of instrument units with instruments as well as visualization units is provided in the robot system.
- the object is achieved in that it comprises the steps: preferably registering the patient using a navigation system; selecting either a visualization unit with a visualization axis or an instrument unit with an instrument and an associated instrument axis as the selected end effector (target end effector), both of which are connected as end effectors to a robot arm with at least one robot arm segment of the robot, the robot arm in turn being connected to a robot base as a local connection point; tracking a location (position and orientation) of the selected end effector using the navigation system; controlling the selected end effector using the robot arm by means of a control unit such that the selected end effector is moved into a position according to a specification, preferably into a position relative to the registered patient.
- This control method can provide the surgeon with at least two different modalities, visualization and instrumentation.
- a preoperative plan can be read in, which includes, for example, CT-based trajectories for an instrument or for waypoints of a visualization.
- the patient can be registered with the preoperative images by a navigation camera of a navigation system.
- the end effector is selected as the selected end effector (target end effector), in particular an automatic selection based on a step currently to be carried out in the operation plan or by manual input.
- the position, in particular the location, of the visualization unit is recorded and in the "mode" of the visualization the robot moves the visualization unit to the predetermined waypoints, in particular according to the operation plan (the location of the end effector is set accordingly).
- the instrument unit is selected as the selected end effector (target end effector)
- the position, in particular the location, of the instrument unit and thus also of the instrument is spatially recorded and tracked (in particular, the instrument tip is known through a known transformation of the instrument or instrument unit) and then, in the "mode" of instrumentation, the robot moves the instrument unit according to a predetermined trajectory. Both alternatives can be followed by an optional step of carrying out the intervention step.
- the surgeon can therefore select in which mode he currently wants to operate, whether he needs visualization and moves the visualization unit accordingly in the visualization mode (or lets it move automatically based on the preoperatively defined operation plan) or whether he needs an instrument and the instrument unit is the selected end effector and moves accordingly.
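The selection between the two modes described above can be sketched as a simple dispatch from the current operation-plan step to the target end effector. The step types and names are illustrative assumptions, not terms from the disclosure:

```python
from enum import Enum

class Mode(Enum):
    VISUALIZATION = "visualization"
    INSTRUMENT = "instrument"

def select_end_effector(plan_step):
    """Choose the selected (target) end effector from the current step
    of the operation plan: waypoints drive the visualization unit,
    trajectories drive the instrument unit."""
    kind = plan_step["type"]
    if kind == "waypoint":
        return Mode.VISUALIZATION
    if kind == "trajectory":
        return Mode.INSTRUMENT
    raise ValueError(f"unknown plan step type: {kind}")

# A waypoint step selects the visualization unit as the end effector
mode = select_end_effector({"type": "waypoint"})
```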
- the medical robot can have two robot arms, wherein a first robot arm is equipped with a visualization unit as an end effector for visualizing the surgical field and a further, second robot arm is equipped with an instrument unit as an end effector for guiding a medical, in particular surgical, instrument, which are connected to a single robot base.
- the two robot arms are mounted on (i.e. connected to) a single, common robot base.
- both arms can each have a (localization) sensor with which the exact position, in particular the location of the respective end effector, can be localized (for example by means of a navigation camera and a tracked tracker as a (localization-location) sensor).
- the sensors can preferably be integrated into the robot arms or attached externally to the end effectors with trackers (such as optical markers).
- the visualization unit on one robot arm can also be used as a visual sensor to locate the end effector of the visualization unit.
- the two robot arms can be used one after the other (for example, first visualize, then instrument) or in parallel (for example, visualize and simultaneously instrument) or independently of one another (in particular only for visualization or only for instrumentation) and controlled accordingly by the control unit.
- the control unit can preferably be adapted so that the two robot arms work together.
- the visualization unit can, for example, recognize critical areas in space and locate the corresponding position in space, and the control unit can be adapted to control the instrument unit on the basis of these detected spatial critical areas (for example, define them as a "no-go zone" for guidance) in such a way that the instrument robot cannot collide with or against these areas.
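A "no-go zone" check of this kind can be sketched as testing the instrument-tip position against detected critical regions, modelled here as spheres for simplicity; geometry and margin are invented example values:

```python
import numpy as np

def violates_no_go(tip_position, zones, margin=0.005):
    """Check an instrument-tip position against spherical no-go zones
    given as (centre, radius) pairs; the margin adds a safety buffer."""
    p = np.asarray(tip_position, dtype=float)
    return any(np.linalg.norm(p - np.asarray(c, dtype=float)) < r + margin
               for c, r in zones)

# One critical area detected by the visualization unit: a 3 cm sphere
zones = [((-0.02, 0.0, 0.10), 0.03)]
blocked = violates_no_go((0.0, 0.0, 0.10), zones)   # inside the zone
clear = violates_no_go((0.10, 0.0, 0.10), zones)    # well outside
```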
- the disclosure in connection with the medical robot according to the present disclosure also applies to the medical robot according to the above independent aspect.
- the features can therefore be exchanged between these two variants.
- the visualization unit can preferably also be used to define surgical targets and locate their position in space so that the instrument unit is guided to the surgical target.
- Fig. 1 is a perspective view of a robot system comprising a robot with a robot arm and two end effectors according to a preferred embodiment, in which the two end effectors are in a rigid spatial relationship to each other;
- Fig. 2 is a perspective view of another embodiment of the robot system with the robot and a robot arm with the two end effectors, in which a joint is provided between the instrument unit and the visualization unit, which can be manually adjusted to increase the range and speed of target detection and to exchange instrument units without tools;
- Fig. 3 shows a system with two robot arms, one robot arm for visualization and one robot arm for instrumentation, in which a navigation camera is used as a tracking system to localize the position of the end effectors (according to an independently claimable aspect); and
- Fig. 4 is a flowchart of a control method according to an embodiment of the present disclosure.
- the figures are schematic in nature and are intended only to assist in understanding the invention. Identical elements are provided with the same reference numerals. The features of the various embodiments can be interchanged.
- Fig. 1 shows a schematic perspective view of a robot system 100 with a surgical collaborative robot 1 according to a first preferred embodiment for actuating two end effectors 2, 4, which is used during an examination or an intervention on a patient P.
- the robot 1 has a robot base 6 as a local connection point, in this case a medical rollable cart, which is only indicated schematically.
- a robot arm 8 with at least one robot arm segment 10 is movably and actuably mounted on the robot base 6.
- Two different end effectors 2, 4 are provided on the terminal free side of the robot arm 8.
- a first end effector 2 is a visualization unit 12 with a visualization axis 14 for visual (optical) recording and a second end effector 4 is a medical instrument unit 16 with a surgical instrument 20 and an associated instrument axis 18.
- a control unit 22 of the robot 1 is adapted to control the spatial position of the visualization unit 12 or the spatial position of the instrument 20 for an intervention by means of the actuable robot arm 8.
- the robot 1 has two end effectors 2, 4: one end effector 2 for visualization and one end effector for the instrument 20. Both end effectors 2, 4 are in a fixed spatial relationship to each other.
- the configuration according to the present disclosure allows two end effectors 2, 4 to be operated independently of one another using only one robot arm 8.
- This configuration is space-saving, safe, efficient and also easy to set up and maintain and enables intuitive operation by a surgeon.
- the surgeon only needs to keep an eye on one robot arm when it is operating in the surgical area and can carry out the desired surgical steps: for example, first set up an optical representation of a tissue that has to be treated, then bring the instrument 20 into the desired position for an (automated) procedure, and after the procedure switch back to the optical representation in order to assess the surgical result.
- An instrument unit adapter 24 is arranged on the robot arm 8 and/or the visualization unit 12, to which the instrument unit 16 can be coupled and decoupled by means of a predefined, complementary counter-adapter 26, so that the instrument unit 16 with the instrument 20 can be removed and exchanged for another, different instrument 20'.
- the further instrument unit 16' with the second instrument 20' in the form of a scalpel can be exchanged for the first instrument 20.
- a medical specialist can select the appropriate instruments 20, 20' intraoperatively depending on the progress of the operation and depending on the need and couple them accordingly to the robot 1 (detachably and without tools).
- the visualization unit 12 has a rigid spatial relationship to the instrument unit 16.
- the control unit 22 is adapted to switch between the position of the visualization unit 12 and the position of the instrument unit 16 via a static transformation stored in a memory and to calculate the other position in each case. In this way, the position of the other end effector 2 can be controlled on the basis of the detected position of one end effector 4. Since a static spatial relationship exists, only a single tracker 28, which is designed in the form of a rigid body with optical markers, is required. This tracker 28 is rigidly attached to the instrument unit 16.
- the control unit 22 is adapted to determine the position of both the instrument 20 and the visualization unit 12 by tracking the single tracker 28, using the predefined rigid spatial relation of the visualization unit 12 to the instrument unit 16, and to control the robot 1 accordingly via the robot arm 8.
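The pose derivation described above, in which a single tracked rigid body and one static transform yield the positions of both end effectors, can be sketched as a chain of homogeneous transforms. This is a minimal illustration only, assuming 4x4 matrices; the function and frame names are hypothetical and not taken from the application.

```python
import numpy as np

def poses_from_single_tracker(T_world_tracker: np.ndarray,
                              T_tracker_instrument: np.ndarray,
                              T_instrument_visualization: np.ndarray):
    """Derive both end-effector poses from one tracked rigid body.

    All arguments are 4x4 homogeneous transforms. T_instrument_visualization
    encodes the static (rigid) spatial relation between the instrument unit
    and the visualization unit, stored once in memory after calibration.
    """
    # Pose of the instrument unit: navigation camera measurement chained
    # with the fixed tracker-to-instrument mounting transform.
    T_world_instrument = T_world_tracker @ T_tracker_instrument
    # Pose of the visualization unit follows from the static relation,
    # so no second tracker is needed.
    T_world_visualization = T_world_instrument @ T_instrument_visualization
    return T_world_instrument, T_world_visualization
```

Because the second pose is computed rather than measured, only one rigid body has to stay in the navigation camera's line of sight, which is the collision-avoidance advantage the text attributes to the single-tracker design.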
- the robot 1 can thus carry out either a visualization modality (visualization unit 12 is controlled as an end effector) or an instrumentation modality via the tracker 28, depending on the selection. Since only one tracker 28 is used, a possible collision or obstruction is less likely than if multiple trackers 28 have to be used.
- the visualization unit 12 is a digital surgical microscope 30 and the instrument unit 16 is a guide sleeve 32 into which a minimally invasive tool can be inserted.
- the instrument axis 18 of the instrument 20 is aligned in such a way that it is parallel to the visualization axis 14 and thus does not intersect it.
- the instrument axis 18 serves to align and arrange the insertable tool, which the surgeon can operate manually, for example. The surgeon is thus collaboratively supported by the robot during the procedure.
- the medical robot 1 further comprises a navigation system 34 with at least one display device in the form of an operating room monitor 36, and the control unit 22 controls the robot 1 with the two end effectors 2, 4 on the basis of the navigation system 34.
- in a visualization mode, waypoints are approached for the visualization unit 12 as the selected end effector 2 and the position of the visualization unit 12 is set accordingly in relation to a patient P.
- in an instrumentation mode, the instrument unit 16 is defined as the selected end effector 4 on the basis of a preoperatively defined operation plan, and the position of the instrument unit 16, and thus the position of the instrument 20, is set accordingly, with the instrument tip specified.
- Fig. 2 shows a further embodiment of the robot system 100 and the robot 1 according to the present disclosure.
- the robot system 100 and the robot 1 as shown in Fig. 2 differ from those of Fig. 1 in that a joint 38 is provided between the visualization unit 12 and the instrument unit 16 (as a serial connection).
- the joint 38 in the form of a ball joint can be adjusted manually, in particular to increase a range and a speed of target detection.
- the joint 38 is therefore initially used by the surgeon for the rough alignment and the surgeon can use the joint to manually adjust the spatial relationship between the visualization unit 12 and the instrument unit 16.
- when the joint 38 is locked, i.e. the ball is frictionally fixed in the socket, the robot arm 8 then carries out a fine alignment of the instrument 20, controlled by the control unit 22.
- the visualization unit and the instrumentation unit have separate (navigation) trackers 28.
- the two axes 14, 18 are arranged such that the instrument axis 18 and the visualization axis 14 move away from each other starting from the two end effectors 2, 4, so that the instrument 20 does not impair the visualization of the visualization unit 12 and is not visible in the field of view of the visualization unit 12.
- Fig. 3 shows an example of an independently claimable aspect that can be claimed in another application.
- two robot arms are connected to a single robot base, with a first robot arm carrying the first end effector 2 (in the form of a visualization unit, here a surgical microscope) and being able to adjust its position in space accordingly, and another robot arm guiding the second end effector 4 (in the form of an instrument unit with an instrument, here a guide bushing) and being able to adjust the position of the instrument indirectly via the position of the instrument unit as the second end effector.
- Fig. 4 shows a flow chart of a computer-implemented control method for a medical robot 1 for actuating a medical end effector 2, 4 during an examination or an intervention on a patient according to a preferred embodiment of the present disclosure.
- This control method can be used in particular with the robot 1 from Fig. 1 or 2.
- a preoperative plan is read in, which provides predefined waypoints and steps.
- in a step S1, a registration of the patient P is carried out using a navigation system 34.
- in a step S2, either a visualization unit 12 with a visualization axis 14 or an instrument unit 16 with an instrument 20 and an associated instrument axis 18 is selected as the end effector, both of which are connected as end effectors 2, 4 to a robot arm 8 with at least one robot arm segment 10 of the robot 1, the robot arm 8 in turn being connected to a robot base 6 as a local connection point.
- the control procedure then branches depending on the selection (i.e. depending on the selected end effector).
- in a step S3a, a position of the visualization unit 12 is tracked by the navigation system 34, in particular by a navigation camera of the navigation system 34 which tracks a tracker.
- in a step S4a, the selected end effector 2 is controlled by the control unit 22 using the robot arm 8 in such a way that the selected end effector 2, i.e. the visualization unit 12, is moved according to a specification into a predetermined position relative to the registered patient P, i.e. moves to a waypoint. An optical examination can then be carried out at this waypoint.
- a robot-assisted visualization is then carried out in this set position of the end effector 2.
- the instrument unit 16 with the instrument 20 is tracked accordingly in a step S3b and the control unit 22 again controls the position of the instrument 20 in a step S4b.
- a robot-assisted intervention can then be carried out in step S5, for example a robot-guided incision.
- the control method therefore enables the optional selection of one of the two end effectors 2, 4, which can then be used in isolation. It is of course also possible for the two end effectors 2, 4 to work together, for example such that the visualization unit continuously scans the room and, when an obstacle is detected, the robot arm with the instrument is controlled in such a way that the instrument does not collide with the detected obstacle.
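The branched control procedure of Fig. 4 (selection in S2, then either S3a/S4a for visualization or S3b/S4b for instrumentation) can be sketched as a small dispatch function. This is a structural illustration only; `navigation` and `robot_arm` stand in for the navigation system 34 and the actuated robot arm 8, and their method names are assumptions, not an API from the application.

```python
from enum import Enum, auto

class EndEffector(Enum):
    """The two selectable end effectors of the robot 1."""
    VISUALIZATION = auto()  # end effector 2, visualization unit 12
    INSTRUMENT = auto()     # end effector 4, instrument unit 16

def control_step(selected: EndEffector, navigation, robot_arm) -> str:
    """One pass of the branched control procedure after selection S2."""
    if selected is EndEffector.VISUALIZATION:
        pose = navigation.track("visualization_unit")  # S3a: track position
        robot_arm.move_to_waypoint(pose)               # S4a: approach waypoint
        return "visualization"                         # optical examination
    else:
        pose = navigation.track("instrument_unit")     # S3b: track instrument
        robot_arm.align_instrument(pose)               # S4b: set position
        return "intervention"                          # S5: robot-assisted step
```

The same loop could be re-entered intraoperatively with a different selection, mirroring how the surgeon switches between the optical representation and the instrument positioning.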
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202480012872.1A CN120693123A (zh) | 2023-02-16 | 2024-02-07 | 具有不同端部效果器的医学机器人、机器人系统和用于医学机器人的控制方法 |
| EP24704720.2A EP4554507A1 (fr) | 2023-02-16 | 2024-02-07 | Robot médical avec différents effecteurs terminaux, système robotique et méthode de commande pour un robot médical |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102023103872.9 | 2023-02-16 | ||
| DE102023103872.9A DE102023103872A1 (de) | 2023-02-16 | 2023-02-16 | Medizinischer Roboter mit unterschiedlichen Endeffektoren, Robotersystem und Steuerverfahren für einen medizinischen Roboter |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024170384A1 true WO2024170384A1 (fr) | 2024-08-22 |
Family
ID=89905750
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2024/053072 Pending WO2024170384A1 (fr) | 2023-02-16 | 2024-02-07 | Robot médical avec différents effecteurs terminaux, système robotique et méthode de commande pour un robot médical |
Country Status (4)
| Country | Link |
|---|---|
| EP (1) | EP4554507A1 (fr) |
| CN (1) | CN120693123A (fr) |
| DE (1) | DE102023103872A1 (fr) |
| WO (1) | WO2024170384A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102010029275A1 (de) * | 2010-05-25 | 2011-12-01 | Siemens Aktiengesellschaft | Verfahren zum Bewegen eines Instrumentenarms eines Laparoskopierobotors in einer vorgebbare Relativlage zu einem Trokar |
| US20210085410A1 (en) * | 2019-09-19 | 2021-03-25 | Auris Health, Inc. | Coordinated movements of robotic tools |
| WO2021199979A1 (fr) * | 2020-03-30 | 2021-10-07 | ソニーグループ株式会社 | Dispositif, système et procédé de traitement d'informations |
| US11540887B2 (en) * | 2020-06-05 | 2023-01-03 | Stryker European Operations Limited | Technique for providing user guidance in surgical navigation |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2321010B1 (fr) * | 2008-08-07 | 2012-12-26 | University Of Rochester | Aide robotique à la localisation pour l'administration d'ultrasons focalisés de haute intensité |
| CA2987061C (fr) * | 2015-06-08 | 2024-01-02 | Covidien Lp | Dispositif de montage pour systemes chirurgicaux et procede d'utilisation |
| US12446969B2 (en) * | 2019-05-20 | 2025-10-21 | Icahn School Of Medicine At Mount Sinai | Robot mounted camera registration and tracking system for orthopedic and neurological surgery |
| US12185926B2 (en) * | 2020-10-14 | 2025-01-07 | Orthosoft Ulc | Quick connect for robotic surgery |
| US11759274B2 (en) * | 2021-03-24 | 2023-09-19 | Point Robotics (Singapore) Pte. Ltd. | Surgical device and method thereof |
| DE102021126484A1 (de) * | 2021-10-13 | 2023-04-13 | B. Braun New Ventures GmbH | Kalibrierungsverfahren zur automatisierten Kalibrierung von Kamera zu medizinischem Roboter und chirurgisches Assistenzsystem |
| DE102021130238A1 (de) * | 2021-11-18 | 2023-05-25 | B. Braun New Ventures GmbH | Medizinischer Roboter mit intuitiver Steuerung und Steuerungsverfahren |
- 2023
  - 2023-02-16 DE DE102023103872.9A patent/DE102023103872A1/de active Pending
- 2024
  - 2024-02-07 CN CN202480012872.1A patent/CN120693123A/zh active Pending
  - 2024-02-07 EP EP24704720.2A patent/EP4554507A1/fr active Pending
  - 2024-02-07 WO PCT/EP2024/053072 patent/WO2024170384A1/fr active Pending
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP4554507A1 * |
Also Published As
| Publication number | Publication date |
|---|---|
| DE102023103872A1 (de) | 2024-08-22 |
| EP4554507A1 (fr) | 2025-05-21 |
| CN120693123A (zh) | 2025-09-23 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24704720 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024704720 Country of ref document: EP |
|
| ENP | Entry into the national phase |
Ref document number: 2024704720 Country of ref document: EP Effective date: 20250213 |
|
| WWP | Wipo information: published in national office |
Ref document number: 2024704720 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 202480012872.1 Country of ref document: CN |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWP | Wipo information: published in national office |
Ref document number: 202480012872.1 Country of ref document: CN |