
WO2025196696A1 - Controlling a surgical tool for performing microsurgical procedures in a robotic manner - Google Patents

Controlling a surgical tool for performing microsurgical procedures in a robotic manner

Info

Publication number
WO2025196696A1
Authority
WO
WIPO (PCT)
Prior art keywords
tool
control
component
surgical tool
surgical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IB2025/052943
Other languages
French (fr)
Inventor
Yaron LEVINSON
Yoav GOLAN
Daniel Glozman
Joseph Nathan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Forsight Robotics Ltd
Original Assignee
Forsight Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Forsight Robotics Ltd filed Critical Forsight Robotics Ltd
Publication of WO2025196696A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/25: User interfaces for surgical systems
    • A61B 34/30: Surgical robots
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 34/76: Manipulators having means for providing feel, e.g. force or tactile feedback
    • A61B 34/77: Manipulators with motion or force scaling
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365: Correlation of different images or relation of image positions in respect to the body, augmented reality, i.e. correlating a live optical image with another image

Definitions

  • Some applications of the present disclosure generally relate to medical apparatus and methods. Specifically, some applications of the present disclosure relate to apparatus and methods for performing microsurgical procedures in a robotic manner.
  • Robotic surgery has significantly advanced the field of surgery by providing enhanced precision, control, and minimally invasive approaches to complex medical procedures.
  • Robotic-assisted systems have evolved, with a strong focus on improving the accuracy, safety, and efficacy of surgeries across a wide range of medical specialties.
  • Certain specialized surgical fields, particularly microsurgery, still face challenges that limit the full potential of these systems.
  • Microsurgery involves performing highly precise surgical procedures on extremely small structures, such as blood vessels, nerves, or tissues, that are typically less than a few millimeters in size. These delicate operations require a combination of exceptional fine motor skills, stable hands, and superior depth perception, all while maintaining a clear and magnified view of the surgical site.
  • Historically, microsurgical procedures were primarily conducted by human surgeons using traditional hand-held instruments, relying heavily on the skill and experience of the operator. Despite the best efforts of surgeons, inherent limitations such as tremor, fatigue, and human error often resulted in compromised outcomes.
  • Cataract surgery involves the removal of the natural lens of the eye that has developed an opacification (known as a cataract), and its replacement with an intraocular lens. Such surgery typically involves a number of standard steps, which are performed sequentially.
  • First, the patient's face around the eye is disinfected (typically with an iodine solution), and the face is covered by a sterile drape, such that only the eye is exposed.
  • The eye is anesthetized, typically using a local anesthetic administered in the form of liquid eye drops.
  • The eyeball is then exposed using an eyelid speculum that holds the upper and lower eyelids open.
  • One or more (e.g., 2-3) incisions, typically including at least one larger incision having a three-planar form, are made in the cornea of the eye.
  • The incisions are typically made using a specialized blade called a keratome blade.
  • An anesthetic, such as lidocaine or another anesthetic, is injected into the anterior chamber of the eye via the corneal incisions.
  • The pupil is dilated, and a viscoelastic injection is applied via the corneal incisions.
  • The viscoelastic injection is performed in order to stabilize the anterior chamber and help maintain eye pressure during the remainder of the procedure, and also in order to distend the lens capsule.
  • In a subsequent stage, known as capsulorhexis, a part of the anterior lens capsule is removed using one or more tools inserted via the corneal incisions.
  • Various enhanced techniques have been developed for performing capsulorhexis, such as laser-assisted capsulorhexis, zepto- rhexis (which utilizes precision nano-pulse technology), and marker-assisted capsulorhexis (in which the cornea is marked using a predefined marker, in order to indicate the desired size for the capsule opening).
  • A fluid wave is injected via the corneal incisions, in order to dissect the cataract's outer cortical layer, in a step known as hydrodissection.
  • In a subsequent step, known as hydrodelineation, the outer softer epi-nucleus of the lens is separated from the inner firmer endo-nucleus by the injection of a fluid wave.
  • Ultrasonic emulsification of the lens is then performed, in a process known as phacoemulsification.
  • The nucleus of the lens is broken initially using a chopper, following which the outer fragments of the lens are broken and removed, typically using an ultrasonic phacoemulsification probe.
  • The remaining lens cortex (i.e., the outer layer of the lens) and the viscoelastic material are aspirated from the capsule.
  • Aspirated fluids are typically replaced with irrigation of a balanced salt solution, in order to maintain fluid pressure in the anterior chamber.
  • The capsule is then polished. Subsequently, the intraocular lens (IOL) is inserted into the capsule.
  • The IOL is typically foldable and is inserted in a folded configuration, before unfolding inside the capsule. If necessary, one or more of the incisions are sealed by elevating the pressure inside the bulbus oculi (i.e., the globe of the eye), causing the internal tissue to be pressed against the external tissue of the incisions, so as to force the incisions closed.
  • Some applications of the present disclosure provide a robotic system for performing a robotic surgical procedure (e.g., a microsurgical procedure, such as an intraocular surgical procedure) on a portion of a body of a subject.
  • the robotic system includes one or more robotic units configured to hold surgical tools, in addition to an imaging system, one or more displays, and a control-component unit, via which one or more operators (e.g., healthcare professionals, such as a physician and/or a nurse) control the robotic units.
  • the robotic system includes at least one computer processor, via which components of the system and operator(s) operatively interact with each other.
  • The control-component unit includes one or more control components that are configured to correspond to respective robotic units of the robotic system.
  • the system may include first and second robotic units, and the control-component unit may include first and second control components.
  • Each of the control components includes an arm that includes a plurality of links that are coupled to each other via joints (e.g., rotational joints or linear joints), and a control-component tool coupled to the links.
  • The computer processor determines the XYZ location and orientation of a portion (e.g., the tip) of the control-component tool, and drives the corresponding robotic unit to move a portion (e.g., the tip) of the surgical tool held by the robotic unit so as to track any changes in this location and orientation.
  • The processor drives the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by the operator, by applying a transformation to coordinates of a portion (e.g., the tip) of the control-component tool to compute corresponding coordinates of a portion (e.g., the tip) of the surgical tool.
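The transformation described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the function name, the use of a single yaw offset, and all parameter values are assumptions made for the example.

```python
import math

def map_control_to_surgical(p_control, *, t_offset, scale, yaw_offset_deg=0.0):
    """Illustrative mapping from control-component-tool tip coordinates to
    surgical-tool tip coordinates: a fixed rotation about the vertical axis
    chosen at engagement time, motion scaling (scale < 1 shrinks hand motion,
    as is useful in microsurgery), and a translation offset.
    """
    x, y, z = p_control
    a = math.radians(yaw_offset_deg)
    # fixed rotation chosen at engagement time
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    ox, oy, oz = t_offset
    return (scale * xr + ox, scale * yr + oy, scale * z + oz)
```

A full implementation would track orientation as well (e.g., as a rotation matrix or quaternion) and apply the transformation to the complete six-degree-of-freedom pose.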
  • The processor executes a process for engaging the control-component tool with the surgical tool, such that movement of the control-component tool by the operator causes corresponding movement of the surgical tool by the robotic unit, as described above.
  • This process includes displaying an image of the portion of the body and the surgical tool; overlaying, on the image, an icon representing the control-component tool, such that the location and orientation of the icon track the location and orientation of the control-component tool; and, in response to an alignment, by the operator, of the icon with the surgical tool in the image, engaging the control-component tool with the surgical tool.
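The alignment check in the engagement process can be sketched as a simple tolerance test in image coordinates. The tolerance values and function name below are illustrative assumptions, not taken from the disclosure.

```python
import math

def icons_aligned(icon_xy, icon_angle_deg, tool_xy, tool_angle_deg,
                  pos_tol_px=15.0, angle_tol_deg=10.0):
    """Return True when the overlaid control-tool icon is aligned with the
    surgical tool in the displayed image, within position and orientation
    tolerances (tolerance values are illustrative)."""
    close = math.dist(icon_xy, tool_xy) <= pos_tol_px
    # wrap the angular difference into [-180, 180) before comparing
    d = (icon_angle_deg - tool_angle_deg + 180.0) % 360.0 - 180.0
    return close and abs(d) <= angle_tol_deg
```

Once such a predicate holds, the processor could latch the engagement and begin driving the robotic unit from the control-tool pose.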
  • At times, the operator may wish to temporarily disengage the control-component tool from the surgical tool, e.g., so as to move the control-component tool to a more convenient position within its workspace.
  • The processor is configured to disengage the control-component tool from the surgical tool in response to a first input from the operator, such that movement of the control-component tool does not cause any movement of the surgical tool, and to re-execute the process for engaging the control-component tool with the surgical tool in response to a second input from the operator.
  • Typically, the operator provides the first input by pressing an input interface, such as a button or foot pedal, and provides the second input by releasing the input interface.
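This press-to-disengage, release-to-re-engage behavior resembles the "clutch" used in many teleoperation systems. A minimal position-only sketch (ignoring orientation, scaling, and the icon-alignment engagement step; all names are illustrative):

```python
class Clutch:
    """While the button or pedal is held, control-tool motion is ignored;
    on release, the translation offset is re-anchored so that the surgical
    tool does not jump."""

    def __init__(self):
        self.engaged = True
        self.offset = (0.0, 0.0, 0.0)

    def press(self):
        self.engaged = False

    def release(self, control_tip, surgical_tip):
        # re-anchor: surgical position = offset + control position from now on
        self.offset = tuple(s - c for s, c in zip(surgical_tip, control_tip))
        self.engaged = True

    def surgical_target(self, control_tip):
        if not self.engaged:
            return None  # surgical tool holds position while clutched
        return tuple(o + c for o, c in zip(self.offset, control_tip))
```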
  • For example, the operator may bring the control-component tool or one of the links into contact with a surface, such that the surface supports the control-component tool or the link.
  • The operator then continues to control the surgical tool, using the control-component tool, while the surface supports the control-component tool or the link.
  • In this manner, the surface helps stabilize the control-component tool.
  • In some applications, in response to an input from the operator, the processor increases the resistance of the control-component tool to forces applied to the control-component tool by the operator. In such applications, the processor typically increases this resistance without disallowing movement of the control-component tool.
  • When one surgical tool is replaced with another surgical tool having a different preferred orientation (e.g., a different preferred pitch), the operator changes the orientation (e.g., the pitch) of the control-component tool accordingly.
  • This change in orientation typically causes the position of the tip of the control-component tool to change. If the transformation, via which the processor transforms the control-component-tool coordinates to the surgical-tool coordinates, were to remain the same, the coordinates of the surgical tool would undesirably jump.
  • To avoid this, the processor is configured to use different transformations for different tool types.
  • The processor is configured to identify, automatically or in response to a manual input, the type of surgical tool held by the robotic unit.
  • Based on the identified type, the processor selects a transformation from multiple predefined transformations.
  • The predefined transformations are configured to account for the different preferred orientations of different types of surgical tools, thereby minimizing the change in the coordinates of the surgical tool when one surgical tool is replaced with another.
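The tool-type-dependent selection might be organized as a lookup of predefined transformation parameters. Everything below, including tool names, pitch values, and offsets, is hypothetical and for illustration only:

```python
# Hypothetical per-tool-type parameters: non-straight tools such as a chopper
# or forceps are held at a different preferred pitch than straight tools such
# as an injector or phacoemulsification handpiece, so each type gets its own
# predefined transformation.
TOOL_TRANSFORMS = {
    "phaco_handpiece":      {"pitch_deg": 0.0,  "z_offset_mm": 0.0},
    "intraocular_injector": {"pitch_deg": 0.0,  "z_offset_mm": 0.0},
    "ophthalmic_chopper":   {"pitch_deg": 30.0, "z_offset_mm": -4.0},
    "forceps":              {"pitch_deg": 20.0, "z_offset_mm": -2.5},
}

def select_transform(tool_type):
    """Pick the predefined transformation for the identified tool type."""
    try:
        return TOOL_TRANSFORMS[tool_type]
    except KeyError:
        raise ValueError(f"no predefined transformation for {tool_type!r}")
```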
  • In some applications, the processor limits movement of the control-component tool with respect to a particular type of movement, such as translational movement in one or more directions, e.g., so as to keep the surgical tool away from a sensitive region of the body or to keep the surgical tool from moving laterally into the edge of an incision.
  • The processor is configured to receive an input, from the operator, indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited.
  • In response, the processor limits movement of the control-component tool with respect to that type of movement, but not with respect to the other types of movement.
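Limiting one type of movement while leaving the others free can be sketched as masking the commanded translation. The function below is an illustrative assumption, not the disclosed method:

```python
def limit_motion(delta_xyz, locked_axes):
    """Zero out the commanded translation along operator-locked axes.

    delta_xyz   : (dx, dy, dz) commanded translation of the surgical tool.
    locked_axes : subset of {'x', 'y', 'z'} the operator asked to freeze,
                  e.g. {'x'} to keep the tool from drifting laterally into
                  the edge of an incision.
    """
    axes = "xyz"
    return tuple(0.0 if axes[i] in locked_axes else d
                 for i, d in enumerate(delta_xyz))
```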
  • There is therefore provided, in accordance with some applications of the present disclosure, apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool.
  • The apparatus includes: a control-component tool; and a processor, configured to: execute a process for engaging the control-component tool with the surgical tool such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit, by: displaying an image that includes a representation of the surgical tool, overlaying, on the image, an icon representing the control-component tool, such that a location and orientation of the icon tracks a location and orientation of the control-component tool, and in response to an alignment, by the operator, of the icon with the representation of the surgical tool in the image, engaging the control-component tool with the surgical tool; in response to a first input from the operator, disengage the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool; and in response to a second input from the operator, re-execute the process for engaging the control-component tool with the surgical tool.
  • the processor is configured to display the representation of the surgical tool by displaying an icon representing the surgical tool.
  • the processor is configured to display the representation of the surgical tool by displaying an image of the surgical tool.
  • the processor is configured to display the representation of the surgical tool by displaying an image of the portion of the body and the surgical tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • the apparatus further includes multiple links that are coupled to each other via one or more joints and are coupled to the control-component tool
  • the processor is configured to re-execute the process for engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface, such that the surface supports the control-component tool or the one of the links when the control-component tool is re-engaged with the surgical tool.
  • the apparatus further includes an input interface, and the operator provides the first input by pressing the input interface and provides the second input by releasing the input interface.
  • the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
  • the input interface includes a button on the control-component tool.
  • a method for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, including: executing, by a processor, a process for engaging a control-component tool with the surgical tool such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit, by: displaying an image that includes a representation of the surgical tool, overlaying, on the image, an icon representing the control-component tool, such that a location and orientation of the icon tracks a location and orientation of the control-component tool, and in response to an alignment, by the operator, of the icon with the surgical tool in the image, engaging the control-component tool with the surgical tool; in response to a first input from the operator, disengaging the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool; and in response to a second input from the operator, re-executing the process for engaging the control-component tool with the surgical tool.
  • displaying the image that includes the representation of the surgical tool includes displaying an icon representing the surgical tool.
  • displaying the image that includes the representation of the surgical tool includes displaying an image of the surgical tool.
  • displaying the image that includes the representation of the surgical tool includes displaying an image of the portion of the body and the surgical tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • control-component tool is coupled to multiple links that are coupled to each other via one or more joints
  • re-executing the process for engaging the control-component tool with the surgical tool includes re-executing the process for engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface, such that the surface supports the control-component tool or the one of the links when the control-component tool is re-engaged with the surgical tool.
  • the operator provides the first input by pressing an input interface and provides the second input by releasing the input interface.
  • the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
  • the input interface includes a button on the control-component tool.
  • a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, by: executing a process for engaging a control-component tool with the surgical tool such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit, by: displaying an image that includes a representation of the surgical tool, overlaying, on the image, an icon representing the control-component tool, such that a location and orientation of the icon tracks a location and orientation of the control-component tool, and in response to an alignment, by the operator, of the icon with the surgical tool in the image, engaging the control-component tool with the surgical tool; in response to a first input from the operator, disengaging the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool; and in response to a second input from the operator, re-executing the process for engaging the control-component tool with the surgical tool.
  • displaying the image that includes the representation of the surgical tool includes displaying an icon representing the surgical tool.
  • displaying the image that includes the representation of the surgical tool includes displaying an image of the surgical tool.
  • displaying the image that includes the representation of the surgical tool includes displaying an image of the portion of the body and the surgical tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • control-component tool is coupled to multiple links that are coupled to each other via one or more joints
  • re-executing the process for engaging the control-component tool with the surgical tool includes re-executing the process for engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface, such that the surface supports the control-component tool or the one of the links when the control-component tool is re-engaged with the surgical tool.
  • the operator provides the first input by pressing an input interface and provides the second input by releasing the input interface.
  • the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
  • the input interface includes a button on the control-component tool.
  • an apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool including: multiple links coupled to each other via one or more joints; a control-component tool coupled to the links; and a processor, configured to: engage the control-component tool with the surgical tool, such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit, in response to a first input from the operator, disengage the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool, and in response to a second input from the operator, re-engage the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface such that the surface supports the control-component tool or the one of the links.
  • the surface is horizontal.
  • the surface supports the control-component tool or the one of the links while a hand of the operator, which holds the control-component tool, is supported by another surface.
  • the processor is further configured to drive the robotic unit, following the re-engagement of the control-component tool with the surgical tool, to move the surgical tool correspondingly to the movement of the control-component tool while the surface supports the control-component tool or the one of the links.
  • the movement of the control-component tool is not constrained by the surface even while the surface supports the control-component tool or the one of the links.
  • control-component tool or the one of the links includes a compression spring and a retractable portion coupled to the compression spring, and the surface supports the control-component tool or the one of the links at the retractable portion, such that the retractable portion retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
  • control-component tool is mounted to a top end of the one of the links, and the retractable portion is positioned at a bottom end of the one of the links.
  • the apparatus further includes the surface, and the links are mounted to the surface.
  • the retractable portion is positioned at a tip of the control-component tool.
  • the apparatus further includes the surface and a compression spring, the surface is mounted to the compression spring such that the surface retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
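The behavior of the spring-loaded retractable portion follows directly from Hooke's law. A back-of-the-envelope sketch (the function name and all force, spring-rate, and travel values are illustrative assumptions):

```python
def retraction_mm(applied_force_n, spring_rate_n_per_mm, max_travel_mm):
    """Estimate how far the retractable portion retracts when the tool or
    link is pushed against the supporting surface, capped at full travel."""
    x = applied_force_n / spring_rate_n_per_mm  # Hooke's law: x = F / k
    return min(x, max_travel_mm)
```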
  • a method for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool including: engaging, by a processor, a control-component tool, which is coupled to multiple links that are coupled to each other via one or more joints, with the surgical tool, such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit; in response to a first input from the operator, disengaging the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool; and in response to a second input from the operator, re-engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface such that the surface supports the control-component tool or the one of the links.
  • the surface is horizontal.
  • the surface supports the control-component tool or the one of the links while a hand of the operator, which holds the control-component tool, is supported by another surface.
  • the method further includes, following the re-engagement of the control-component tool with the surgical tool, driving the robotic unit to move the surgical tool correspondingly to the movement of the control-component tool while the surface supports the control-component tool or the one of the links.
  • the movement of the control-component tool is not constrained by the surface even while the surface supports the control-component tool or the one of the links.
  • control-component tool or the one of the links includes a compression spring and a retractable portion coupled to the compression spring, and the surface supports the control-component tool or the one of the links at the retractable portion, such that the retractable portion retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
  • control-component tool is mounted to a top end of the one of the links, and the retractable portion is positioned at a bottom end of the one of the links.
  • the links are mounted to the surface.
  • the retractable portion is positioned at a tip of the control-component tool.
  • the surface is mounted to a compression spring such that the surface retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
  • a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, by: engaging a control-component tool, which is coupled to multiple links that are coupled to each other via one or more joints, with the surgical tool, such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit; in response to a first input from the operator, disengaging the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool; and in response to a second input from the operator, re-engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface such that the surface supports the control-component tool or the one of the links.
  • the surface is horizontal.
  • the surface supports the control-component tool or the one of the links while a hand of the operator, which holds the control-component tool, is supported by another surface.
  • the instructions further cause the processor to drive the robotic unit, following the re-engagement of the control-component tool with the surgical tool, to move the surgical tool correspondingly to the movement of the control-component tool while the surface supports the control-component tool or the one of the links.
  • the movement of the control-component tool is not constrained by the surface even while the surface supports the control-component tool or the one of the links.
  • control-component tool or the one of the links includes a compression spring and a retractable portion coupled to the compression spring, and the surface supports the control-component tool or the one of the links at the retractable portion, such that the retractable portion retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
  • control-component tool is mounted to a top end of the one of the links, and the retractable portion is positioned at a bottom end of the one of the links.
  • the links are mounted to the surface.
  • the retractable portion is positioned at a tip of the control-component tool.
  • the surface is mounted to a compression spring such that the surface retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
  • an apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool including: a control-component tool; and a processor, configured to: identify a type of the surgical tool, in response to identifying the type, select a transformation from multiple predefined transformations, and drive the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by an operator, by applying the selected transformation to coordinates of the control-component tool to compute corresponding coordinates of the surgical tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • the predefined transformations include respective translations that vary from each other.
  • the translations vary from each other along a vertical axis.
  • the processor is configured to apply the selected transformation to coordinates of a tip of the control-component tool to compute corresponding coordinates of a tip of the surgical tool.
  • the predefined transformations map different respective coordinates of the control-component tool to the same coordinates of the surgical tool.
  • the predefined transformations correspond to different respective orientations at which surgical tools are held during the surgical procedure.
  • the predefined transformations correspond to different respective pitches at which surgical tools are held during the surgical procedure.
  • the processor is configured to select a different one of the predefined transformations in response to the surgical tool being of a non-straight type, relative to the surgical tool being of a straight type.
  • the non-straight type is selected from the group of types consisting of: an ophthalmic chopper, and forceps.
  • the straight type is selected from the group of types consisting of: an intraocular injector, and a phacoemulsification handpiece.
  • the processor is further configured to, prior to identifying the type of the surgical tool: drive the robotic unit to move another surgical tool correspondingly to the movement of the control-component tool, by applying another one of the predefined transformations to the coordinates of the control-component tool to compute the corresponding coordinates of the surgical tool, and disengage the control-component tool from the other surgical tool, such that the movement of the control-component tool does not cause any movement of the other surgical tool, and the processor is configured to identify the type of the surgical tool in response to the surgical tool replacing the other surgical tool while the control-component tool is disengaged from the other surgical tool.
  • the application of the selected transformation reduces required movement of the control-component tool following the surgical tool replacing the other surgical tool, relative to if the other one of the predefined transformations were applied following the surgical tool replacing the other surgical tool.
  • a method for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool including: identifying, by a processor, a type of the surgical tool; in response to identifying the type, selecting a transformation, by the processor, from multiple predefined transformations; and driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator, by applying the selected transformation to coordinates of the control-component tool to compute corresponding coordinates of the surgical tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • the predefined transformations include respective translations that vary from each other.
  • the translations vary from each other along a vertical axis.
  • applying the selected transformation includes applying the selected transformation to coordinates of a tip of the control-component tool to compute corresponding coordinates of a tip of the surgical tool.
  • the predefined transformations map different respective coordinates of the control-component tool to the same coordinates of the surgical tool.
  • the predefined transformations correspond to different respective orientations at which surgical tools are held during the surgical procedure.
  • the predefined transformations correspond to different respective pitches at which surgical tools are held during the surgical procedure.
  • selecting the transformation includes selecting a different one of the predefined transformations in response to the surgical tool being of a non-straight type, relative to the surgical tool being of a straight type.
  • the non-straight type is selected from the group of types consisting of: an ophthalmic chopper, and forceps.
  • the straight type is selected from the group of types consisting of: an intraocular injector, and a phacoemulsification handpiece.
  • the method further includes, prior to identifying the type of the surgical tool: driving the robotic unit to move another surgical tool correspondingly to the movement of the control-component tool, by applying another one of the predefined transformations to the coordinates of the control-component tool to compute the corresponding coordinates of the surgical tool; and disengaging the control-component tool from the other surgical tool, such that the movement of the control-component tool does not cause any movement of the other surgical tool, identifying the type of the surgical tool includes identifying the type of the surgical tool in response to the surgical tool replacing the other surgical tool while the control-component tool is disengaged from the other surgical tool.
  • the application of the selected transformation reduces required movement of the control-component tool following the surgical tool replacing the other surgical tool, relative to if the other one of the predefined transformations were applied following the surgical tool replacing the other surgical tool.
  • a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, by: identifying a type of the surgical tool, in response to identifying the type, selecting a transformation from multiple predefined transformations, and driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator, by applying the selected transformation to coordinates of the control-component tool to compute corresponding coordinates of the surgical tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • the predefined transformations include respective translations that vary from each other.
  • the translations vary from each other along a vertical axis.
  • the instructions cause the processor to apply the selected transformation to coordinates of a tip of the control-component tool to compute corresponding coordinates of a tip of the surgical tool.
  • the predefined transformations map different respective coordinates of the control-component tool to the same coordinates of the surgical tool.
  • the predefined transformations correspond to different respective orientations at which surgical tools are held during the surgical procedure.
  • the predefined transformations correspond to different respective pitches at which surgical tools are held during the surgical procedure.
  • the instructions cause the processor to select a different one of the predefined transformations in response to the surgical tool being of a non-straight type, relative to the surgical tool being of a straight type.
  • the non-straight type is selected from the group of types consisting of: an ophthalmic chopper, and forceps.
  • the straight type is selected from the group of types consisting of: an intraocular injector, and a phacoemulsification handpiece.
  • the instructions further cause the processor to, prior to identifying the type of the surgical tool: drive the robotic unit to move another surgical tool correspondingly to the movement of the control-component tool, by applying another one of the predefined transformations to the coordinates of the control-component tool to compute the corresponding coordinates of the surgical tool, and disengage the control-component tool from the other surgical tool, such that the movement of the control-component tool does not cause any movement of the other surgical tool, and the instructions cause the processor to identify the type of the surgical tool in response to the surgical tool replacing the other surgical tool while the control-component tool is disengaged from the other surgical tool.
  • the application of the selected transformation reduces required movement of the control-component tool following the surgical tool replacing the other surgical tool, relative to if the other one of the predefined transformations were applied following the surgical tool replacing the other surgical tool.
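The tool-type-dependent mapping described in the preceding bullets can be sketched as follows. This is a minimal illustration only; the tool-type names, the offset values, and the function name are assumptions for illustration, not taken from the disclosure:

```python
# Hypothetical predefined transformations, one per tool type. Each maps
# control-component-tool tip coordinates to surgical-tool tip coordinates;
# here they differ only by a translation along the vertical (z) axis, as
# some of the bullets above suggest.
TRANSFORMATIONS = {
    "straight": (0.0, 0.0, 0.0),       # e.g., intraocular injector
    "non_straight": (0.0, 0.0, -5.0),  # e.g., ophthalmic chopper (mm, assumed)
}

def surgical_tip_coords(control_tip, tool_type):
    """Apply the transformation selected for the identified tool type to the
    control-component-tool tip coordinates, yielding surgical-tool tip
    coordinates."""
    dx, dy, dz = TRANSFORMATIONS[tool_type]
    x, y, z = control_tip
    return (x + dx, y + dy, z + dz)
```

Under this scheme, when a straight tool is replaced by a non-straight one, the same control-tool pose maps to a different surgical-tool pose, which is how the selected transformation can reduce the movement the operator must make after a tool change.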
  • an apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool including: a control-component tool; and a processor, configured to: drive the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by an operator, receive an input from the operator, and in response to the input, increase a resistance of the control-component tool to forces applied to the control-component tool by the operator, without disallowing the movement.
  • the apparatus further includes an input interface, and the operator provides the input by pressing the input interface.
  • the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
  • the input interface includes a button on the control-component tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • the control-component tool is held by a non-dominant hand of the operator.
  • the apparatus further includes another control-component tool, the control-component tool is held by a first hand of the operator, and the processor is configured to increase the resistance while a second hand of the operator moves the other control-component tool.
  • the apparatus further includes: a plurality of links coupled to each other via one or more joints and coupled to the control-component tool; and respective motors operatively coupled to the joints, the processor is configured to increase the resistance by increasing a counterforce applied to the joints by the motors.
  • a method for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool including: driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator; receiving an input from the operator; and in response to the input, increasing a resistance of the control-component tool to forces applied to the control-component tool by the operator, without disallowing the movement.
  • the operator provides the input by pressing an input interface.
  • the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
  • the input interface includes a button on the control-component tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • the control-component tool is held by a non-dominant hand of the operator.
  • the control-component tool is held by a first hand of the operator, and increasing the resistance includes increasing the resistance while a second hand of the operator moves another control-component tool.
  • the control-component tool is coupled to a plurality of links that are coupled to each other via one or more joints, respective motors are operatively coupled to the joints, and increasing the resistance includes increasing the resistance by increasing a counterforce applied to the joints by the motors.
  • a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, by: driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator, receiving an input from the operator, and in response to the input, increasing a resistance of the control-component tool to forces applied to the control-component tool by the operator, without disallowing the movement.
  • the operator provides the input by pressing an input interface.
  • the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
  • the input interface includes a button on the control-component tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • the control-component tool is held by a non-dominant hand of the operator.
  • the control-component tool is held by a first hand of the operator, and increasing the resistance includes increasing the resistance while a second hand of the operator moves another control-component tool.
  • the control-component tool is coupled to a plurality of links that are coupled to each other via one or more joints, respective motors are operatively coupled to the joints, and increasing the resistance includes increasing the resistance by increasing a counterforce applied to the joints by the motors.
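The resistance feature in the bullets above can be sketched as a simple gain on the joint counterforce. The constants and the function name are assumptions for illustration; an actual implementation would compute per-joint torques:

```python
BASE_COUNTERFORCE = 0.2  # assumed baseline damping torque per joint (N*m)
RESISTANCE_GAIN = 5.0    # assumed multiplier applied while the input is active

def joint_counterforce(resist_input_active):
    """Hypothetical sketch: while the operator holds the input (e.g., a
    button or foot pedal), each joint motor applies a larger counterforce,
    making the control-component tool harder, but not impossible, to move."""
    gain = RESISTANCE_GAIN if resist_input_active else 1.0
    return BASE_COUNTERFORCE * gain
```

Because the counterforce is finite, movement is resisted rather than disallowed, matching the "without disallowing the movement" language of the claims.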
  • an apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool including: a control-component tool; and a processor, configured to: drive the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by an operator, receive an input, from the operator, indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited, and in response to the input, limit the movement of the control-component tool with respect to the type of movement but not with respect to the other types of movement.
  • the apparatus further includes an input interface, and the operator provides the input by pressing the input interface.
  • the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
  • the input interface includes a button on the control-component tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • the apparatus further includes: a plurality of links coupled to each other via one or more joints and coupled to the control-component tool; and respective motors operatively coupled to the joints, and the processor is configured to limit the movement by increasing a counterforce applied to the joints by the motors.
  • the processor is configured to limit the movement by disallowing the movement with respect to the type of movement.
  • the control-component tool, prior to the limiting of the movement, has multiple degrees of freedom, and the processor is configured to limit the movement of the control-component tool with respect to at least one of the degrees of freedom.
  • the control-component tool, prior to the limiting of the movement, has six degrees of freedom.
  • the processor is configured to limit translational movement of the control-component tool in at least one direction without limiting any rotational movement of the control-component tool.
  • the direction is defined with respect to an orientation of the control-component tool.
  • the direction is along a longitudinal axis of the control-component tool.
  • the processor is configured to limit all translational movement.
  • the processor is configured to inhibit the surgical tool from moving deeper into the portion of the body of the subject by limiting the movement.
  • the portion of the body includes an eye of the subject.
  • the processor is configured to inhibit the surgical tool from moving into an edge of an incision in the body of the subject by limiting the movement.
  • the incision includes a corneal incision.
  • a method for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool including: driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator; receiving an input, from the operator, indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited; and in response to the input, limiting the movement of the control-component tool with respect to the type of movement but not with respect to the other types of movement.
  • the operator provides the input by pressing an input interface.
  • the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
  • the input interface includes a button on the control-component tool.
  • the surgical tool includes an ophthalmic surgical tool.
  • the control-component tool is coupled to a plurality of links that are coupled to each other via one or more joints, respective motors are operatively coupled to the joints, and limiting the movement includes limiting the movement by increasing a counterforce applied to the joints by the motors.
  • limiting the movement includes disallowing the movement with respect to the type of movement.
  • the control-component tool, prior to the limiting of the movement, has multiple degrees of freedom, and limiting the movement of the control-component tool includes limiting the movement of the control-component tool with respect to at least one of the degrees of freedom.
  • the control-component tool, prior to the limiting of the movement, has six degrees of freedom.
  • limiting the movement includes limiting translational movement of the control-component tool in at least one direction without limiting any rotational movement of the control-component tool.
  • the direction is defined with respect to an orientation of the control-component tool.
  • the direction is along a longitudinal axis of the control-component tool.
  • limiting the translational movement includes limiting all translational movement.
  • the method includes, by limiting the movement, inhibiting the surgical tool from moving deeper into the portion of the body of the subject.
  • the portion of the body includes an eye of the subject.
  • the method includes, by limiting the movement, inhibiting the surgical tool from moving into an edge of an incision in the body of the subject.
  • the incision includes a corneal incision.
  • a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, by: driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator, receiving an input, from the operator, indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited, and in response to the input, limiting the movement of the control-component tool with respect to the type of movement but not with respect to the other types of movement.
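The selective movement limiting described above — suppressing translation (entirely, or along one tool-frame axis) while leaving rotation free — can be sketched as a filter on the commanded motion. The function name and argument conventions are assumptions for illustration:

```python
def limit_motion(delta_translation, delta_rotation,
                 limit_axis=None, limit_all_translation=False):
    """Hypothetical filter applied to a commanded control-tool motion:
    suppress translational movement (all of it, or the component along one
    tool-frame axis, e.g. the tool's longitudinal axis) while leaving
    rotational movement untouched."""
    t = list(delta_translation)
    if limit_all_translation:
        t = [0.0, 0.0, 0.0]
    elif limit_axis is not None:
        norm = sum(a * a for a in limit_axis) ** 0.5
        u = [a / norm for a in limit_axis]            # unit vector of the limited axis
        proj = sum(ti * ui for ti, ui in zip(t, u))   # component along that axis
        t = [ti - proj * ui for ti, ui in zip(t, u)]  # remove it
    return t, list(delta_rotation)
```

Limiting the longitudinal component in this way is one plausible means of inhibiting the surgical tool from moving deeper into the eye, or into the edge of a corneal incision, while still permitting reorientation.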
  • an apparatus for performing a procedure on a portion of a body of a patient using a surgical tool that has a tip, the apparatus being for use with a robotic unit configured to move the surgical tool, the apparatus including: a control-component unit that includes: a control-component tool that is configured to be moved by an operator and that defines a tip; and an inertial measurement unit including at least one location sensor selected from the group consisting of: a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer, the inertial measurement unit being configured to generate inertial-measurement-unit data indicative of an orientation of the tip of the control-component tool; and a computer processor configured to: determine a location and orientation of the tip of the control-component tool based upon the inertial-measurement-unit data received from the location sensor, drive the robotic unit to move the tip of the surgical tool within the portion of the body of the patient in a
  • a method for performing a procedure on a portion of a body of a patient using a surgical tool that has a tip, the method being for use with: a robotic unit configured to move the surgical tool, and a control-component unit that includes: a control-component tool that is configured to be moved by an operator and that defines a tip, and an inertial measurement unit including at least one location sensor selected from the group consisting of: a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer, the inertial measurement unit being configured to generate inertial-measurement-unit data indicative of an orientation of the tip of the control-component tool, the method including: using a computer processor, determining a location and orientation of the tip of the control-component tool based upon the inertial-measurement-unit data received from the location sensor, driving the robotic unit to move the tip of the surgical tool within the portion of the body of the patient in
  • a computer software product for use with: a robotic unit configured to move a surgical tool that has a tip, and a control-component unit that includes: a control-component tool that is configured to be moved by an operator and that defines a tip, and an inertial measurement unit including at least one location sensor selected from the group consisting of: a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer, the inertial measurement unit being configured to generate inertial-measurement-unit data indicative of an orientation of the tip of the control-component tool, the computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a procedure on a portion of a body of a patient using the surgical tool, by: determining a location and orientation of the tip of the control-component tool based upon the inertial-measurement-unit data
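Orientation estimation from the inertial measurement unit described above can be illustrated with a single-angle complementary filter, one common way to fuse gyroscope and accelerometer readings. The function name, the blend factor, and the single-angle simplification are assumptions; a production system would more likely use a quaternion-based or Kalman-filter fusion over all three sensors:

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Hypothetical, simplified sensor-fusion step for one tilt angle:
    blend the integrated gyroscope rate (responsive but drifts over time)
    with the accelerometer-derived angle (drift-free but noisy).
    angle_prev and accel_angle in radians, gyro_rate in rad/s, dt in s."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Running this update at each sample keeps the fast gyroscope response while the accelerometer term slowly corrects accumulated drift.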
  • an apparatus for performing a procedure on an eye of a patient using two or more tools including: a first control component configured to be operated by a first hand of an operator and a second control component configured to be operated by a second hand of the operator, the first control component and second control component being positioned in a given configuration with respect to each other; a first robotic unit configured to hold a first one of the two or more tools, and a second robotic unit being configured to hold a second one of the two or more tools, the first robotic unit and second robotic unit corresponding respectively to the first control component and second control component, the first robotic unit being positioned in a non-standard position with respect to the eye of the patient, so that the first robotic unit and second robotic unit are positioned with respect to each other in a configuration that is different from the given configuration; and at least one computer processor configured to process inputs into each of the first control component and second control component such as to generate outputs at the corresponding robotic unit, the computer processor being configured to
  • the first robotic unit is configured to insert the first tool into the patient’s eye from an inferior position with respect to the patient’s eye.
  • the first robotic unit is configured to insert the first tool into the patient’s eye via an incision in a cornea of the patient’s eye that is configured to treat an astigmatism of the cornea.
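When a robotic unit is mounted in a non-standard position relative to its control component, commands can be re-expressed in the unit's frame. The sketch below rotates a motion command about the vertical axis; the function name and the 90-degree example are assumptions for illustration, not taken from the disclosure:

```python
import math

def rotate_about_vertical(delta, theta_deg):
    """Hypothetical frame mapping: rotate a motion command (x, y, z) from
    the control-component frame into a robotic unit's frame about the
    vertical axis. A 90-degree value could correspond to, e.g., a
    superior-mounted unit whose control component sits in the standard
    lateral configuration."""
    th = math.radians(theta_deg)
    x, y, z = delta
    return (x * math.cos(th) - y * math.sin(th),
            x * math.sin(th) + y * math.cos(th),
            z)
```

With such a mapping, the operator's hands can remain in the familiar configuration even though the robotic units are not positioned in that configuration around the eye.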
  • Figs. 1A and 1B are schematic illustrations of a robotic system that is configured for use in a microsurgical procedure, such as intraocular surgery, in accordance with some applications of the present disclosure
  • Fig. 2A is a schematic illustration of a display showing a surgical tool placed laterally with respect to a patient’s cornea, in accordance with some applications of the present disclosure
  • Fig. 2B is a schematic illustration of a display showing an augmented surgical tool overlaid on the laterally-placed surgical tool, such that the tip of the augmented surgical tool is placed in a vicinity of the tip of the surgical tool, in accordance with some applications of the present disclosure;
  • Figs. 2C and 2D are schematic illustrations of a display showing an augmented control-component tool being positioned such as to overlay the augmented surgical tool in order to engage a first control-component tool with the robotic system, in accordance with some applications of the present disclosure
  • Fig. 2E is a schematic illustration of a display showing a surgical tool placed superiorly with respect to a patient’s cornea, in accordance with some applications of the present disclosure
  • Fig. 2F is a schematic illustration of a display showing an augmented surgical tool overlaid on the superiorly-placed surgical tool, such that the tip of the augmented surgical tool is placed in a vicinity of the tip of the surgical tool, in accordance with some applications of the present disclosure
  • Figs. 2G and 2H are schematic illustrations of a display showing an augmented control-component tool being positioned such as to overlay the augmented surgical tool in order to engage a second control-component tool with the robotic system, in accordance with some applications of the present disclosure
  • Fig. 2I is a schematic illustration of a display showing an augmented control-component tool being moved away from an augmented surgical tool in order to disengage a control-component tool from the robotic system, in accordance with some applications of the present disclosure
  • Fig. 3A is a schematic illustration of the robotic system marked with cuboids that indicate workspaces of a control component and a surgical tool for illustrative purposes, in accordance with some applications of the present disclosure
  • Fig. 3B is a schematic illustration of the robotic system annotated with several frames of references for illustrative purposes, in accordance with some applications of the present disclosure
  • Figs. 4A, 4B, 4C, and 4D are schematic illustrations of a control component of a control-component unit, in accordance with some applications of the present disclosure
  • Figs. 5A and 5B are schematic illustrations of a workstation, in accordance with some embodiments of the present disclosure.
  • Fig. 6 schematically illustrates use of a tool-type-specific transformation, in accordance with some embodiments of the present disclosure.
  • Fig. 7 is a flow diagram for controlling surgical tools using tool-type-specific transformations, in accordance with some embodiments of the present disclosure.
  • Figs. 1A and 1B are schematic illustrations of a robotic system 10 that is configured for use in a microsurgical procedure, such as intraocular surgery, in accordance with some applications of the present disclosure.
  • robotic system 10 includes one or more robotic units 20, which are configured to hold tools 21, and an imaging system 22.
  • System 10 further includes one or more displays 24 and a control-component unit 26 (e.g., a control-component unit that includes a pair of control components, as shown in the enlarged portion of Fig. 1A), which are typically located at a workstation 84.
  • robotic system 10 includes one or more computer processors 28, via which components of the system and operator(s) 25 operatively interact with each other.
  • Figs. 1A and 1B show different setups of a robotic system 10 that is configured for ophthalmic surgery.
  • first and second robotic units are disposed at respective lateral positions (i.e., left and right) with respect to the eye that is being operated on, such that tools 21 that are held by the robotic units are disposed at approximately 180 degrees from each other.
  • the configuration shown in Fig. 1B includes a first robotic unit that is placed laterally with respect to the eye and a second robotic unit positioned in a superior position with respect to the eye, such that tools 21 that are held by the robotic units are disposed at approximately 90 degrees from each other. (In the context of ophthalmic procedures, the lateral position shown in Fig.
  • the first robotic unit is placed laterally with respect to the eye and the second robotic unit positioned in an inferior position with respect to the eye, such that tools 21 that are held by the robotic units are disposed at approximately 90 degrees from each other.
  • the scope of the present disclosure includes using any number of robotic units placed at any number of respective positions in relation to the patient, and the configurations shown in Figs. 1A and 1B should not be interpreted as limiting the scope of the disclosure in any way.
  • movement of the robotic units is at least partially controlled by one or more operators (e.g., healthcare professionals, such as a physician 25A and/or a nurse 25B).
  • the operator may receive images of the patient's eye and the robotic units and/or tools disposed therein, via display 24.
  • images are acquired by imaging system 22.
  • imaging system 22 is a stereoscopic imaging device and display 24 is a stereoscopic display. Based on the received images, the operator typically performs steps of the procedure.
  • Figs. 1A and 1B show physician 25A providing commands to the robotic units via control-component unit 26, while viewing images of the patient’s eye and tools 21 upon display 24.
  • commands include commands that control the position and/or orientation of tools that are disposed within the robotic units, and/or commands that control actions that are performed by the tools.
  • the commands may control a blade, a phacoemulsification tool (e.g., the operation mode and/or suction power of the phacoemulsification tool), forceps (e.g., opening and closing of forceps), an intraocular-lens-manipulator tool (e.g., such that the tool manipulates the intraocular lens inside the eye for precise positioning of the intraocular lens within the eye), and/or injector tools (e.g., which fluid (e.g., viscoelastic fluid, saline, etc.) should be injected, and/or at what flow rate).
  • the operator may input commands that control the imaging system (e.g., the zoom, focus, orientation, and/or XYZ positioning of the imaging system).
  • the control-component unit includes one or more control components 30 that are configured to correspond to respective robotic units 20 of the robotic system.
  • the system may include first and second robotic units, and the control-component unit may include first and second control components, as shown.
  • each of the control components is an arm 31 that includes a plurality of links that are coupled to each other via joints.
  • the control components include respective control-component tools 32 (that are typically configured to replicate the robotic units), as shown in Fig. 1A.
  • the computer processor determines the XYZ location and orientation of the tip of the control-component tool 32, and drives the robotic unit such that the tip of the actual tool 21 that is being used to perform the procedure tracks the movements of the tip of the control-component tool and such that changes in the orientation of tool 21 track changes in the orientation of the control-component tool.
  • movement of the control-component tool by the operator is scaled up or down by the computer processor, as described in further detail hereinbelow.
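The scaling mentioned above can be illustrated with a one-line mapping from control-tool displacement to surgical-tool displacement. The scale factor shown is an assumption for illustration, not a value from the disclosure:

```python
SCALE = 0.25  # assumed scale-down factor: 1 mm of control-tool motion -> 0.25 mm of tool motion

def scaled_tool_delta(control_tip_delta, scale=SCALE):
    """Hypothetical sketch: scale the control-component-tool tip
    displacement before it is commanded to the surgical-tool tip, so that
    comfortable hand motions map to fine intraocular motions."""
    return [scale * d for d in control_tip_delta]
```

A factor below 1 scales motion down for precision; a factor above 1 would scale it up to cover a larger workspace with small hand movements.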
  • tool 21 is described herein, in the specification and in the claims, as a “surgical tool.” This term is used in order to distinguish tool 21 from control-component tool 32, and should not be interpreted as limiting the type of tool that may be used as tool 21 in any way.
  • the term “surgical tool” should be interpreted to include any one of the tools described herein and/or any other types of tools that may occur to a person of ordinary skill in the art upon reading the present disclosure.
  • the surgical tool is an ophthalmic tool, e.g., one of the ophthalmic tools described hereinabove.
  • the right control component controls movement of the surgical tool that is toward the right of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s right hand)
  • the left control component controls movement of the surgical tool that is toward the left of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s left hand).
  • the computer processor typically changes the orientation of the surgical tool to correspond with the change in the orientation of the control-component tool.
  • control-component tool becomes engaged with the surgical tool of the robotic unit (such that the movements of the control-component tool control movement of the surgical tool) with the orientations of the surgical tool and the control-component tool being substantially similar to each other. If the orientations of the surgical tool and the control-component tool are dissimilar from each other, this can lead to the operator being disoriented, which may in turn lead to discomfort, extended surgical durations, and erroneous movements due to the operator’s disorientation.
  • the operator (e.g., physician 25A) assumes and relinquishes control of the surgical tool (via the control-component tool) multiple times during a procedure, especially when the procedure requires the use of multiple tools that are changed during surgery (as is typically the case with surgical procedures, as described above).
  • the operator assumes control of the surgical tool
  • the operator performs surgical actions with the surgical tool
  • the robotic system or an operator removes the surgical tool from the robotic unit
  • the robotic system or the operator places a new surgical tool on the robotic unit, and steps (a)-(e) are repeated.
  • the control of the surgical tool by the operator has limitations.
  • the workspace in which the operator can move the control-component tool (referred to hereinafter as “the control-component workspace”) is typically physically constrained by where it is comfortable or even possible for the operator to move the control-component tool.
  • the workspace of the surgical tool (referred to hereinafter as “the tool workspace”) is typically physically constrained by the space within which it is possible for the robotic arm to move the surgical tool.
  • each of the control-component tools has a respective control-component workspace
  • each of the surgical tools has a respective tool workspace.
  • one of the limitations on the control-component workspace for one of the control-component tools is that it impinges on the control-component workspace of a second control-component tool.
  • one of the limitations on the tool workspace for one of the surgical tools is that it impinges on the tool workspace of a second one of the surgical tools.
  • the control-component workspace should be such that the control-component tool has sufficient freedom of movement such as to have the ability to control movement of the surgical tool within the surgical space. If the operator assumes control of the surgical tool (via the control-component tool) when the control-component tool is close to the edge of the control-component workspace, the movement of the control-component tool (and therefore that of the surgical tool) will be limited. Therefore, it is typically preferable for the operator to engage the control-component tool with the surgical tool when the control-component tool is positioned and oriented such that the operator has good freedom of movement of the control-component tool.
  • the tool workspace should ideally cover the space within which the tool is expected to be manipulated for the purpose of the surgery (hereinafter “the surgical space”). If the operator assumes control of the surgical tool (via the control-component tool) when the surgical tool is at the edge of the tool workspace, the movement of the surgical tool will be limited. Therefore, it is typically preferable for the operator to engage the control-component tool with the surgical tool when the surgical tool is positioned and oriented such that it has good freedom of movement.
  • the operator should be able to freely move the surgical tool to all positions and orientations within the surgical space, without the control-component tool reaching the limits of the control-component workspace and without the surgical tool reaching the limits of the tool workspace.
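One simple way to reason about the workspace limits discussed above is an axis-aligned cuboid bounds check on a proposed tool position (Figs. 3A annotates the workspaces as cuboids). This is a hypothetical sketch; the bounds, units, and names are illustrative and not taken from the disclosure.

```python
# Hypothetical sketch of a cuboid workspace-limit check, as one way to keep
# a tool inside its workspace; bounds and names are illustrative.

def inside_workspace(pos, lo, hi):
    """Return True if pos (x, y, z) lies within the axis-aligned cuboid
    whose opposite corners are lo and hi."""
    return all(l <= p <= h for p, l, h in zip(pos, lo, hi))

lo, hi = (-30.0, -30.0, -20.0), (30.0, 30.0, 20.0)   # mm, illustrative
print(inside_workspace((0.0, 10.0, 5.0), lo, hi))    # True
print(inside_workspace((35.0, 0.0, 0.0), lo, hi))    # False
```

A controller could run such a check on every commanded motion and refuse (or clamp) motions that would carry the tool out of its workspace.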
  • the control-component tool becomes engaged with the surgical tool (such that the movements of the control-component tool control movement of the surgical tool) with the orientations of the surgical tool and the control-component tool being substantially similar to each other, thereby avoiding disorientation of the operator.
  • the control-component tool becomes engaged with the surgical tool, when the surgical tool and the control-component tool are toward the centers of the tool workspace and the control-component workspace, respectively.
  • the operator is able to engage the surgical tool and the control-component tool to each other and/or disengage the surgical tool and the control-component tool from each other using standard movements of the control-component tool and without requiring additional external inputs.
  • Fig. 2A is a schematic illustration of display 24 showing surgical tool 21 placed laterally with respect to a patient’s cornea 38, in accordance with some applications of the present disclosure.
  • the process for engaging the control-component tool with the surgical tool begins with the display of an image of the eye and the surgical tool on display 24, as shown in Fig. 2A.
  • physician 25A views the image on display 24.
  • the imaging system acquires anterior images of the patient’s eye, since the imaging system is typically disposed above the patient’s eye.
  • display 24 is a three-dimensional stereoscopic display.
  • the imaging system is configured to acquire images that encompass the entire surgical space and display 24 displays the acquired images, such that any manipulation of the surgical tools occurs within the field of view that is displayed on display 24.
  • the robotic system is configured to automatically place the surgical tool in position in the vicinity of the patient’s cornea (e.g., based upon images that are acquired by imaging system 22).
  • the operator (e.g., nurse 25B)
  • the surgical tool is positioned in the vicinity of the patient’s cornea (by the robotic system and/or by the nurse) at a relatively central position within the tool workspace, and/or such that from this initial position within the tool workspace there is no movement of the control-component tool within the control-component workspace that would result in the tool being moved out of the tool workspace.
  • starting from this position typically allows completion of a step of the procedure that is to be performed by the surgical tool, without the need to reposition the robotic unit during the step of the procedure.
  • the computer processor identifies the surgical tool and generates an augmented surgical tool 40 (i.e., an augmented image of a surgical tool) that overlays the image of the surgical tool itself.
  • augmented surgical tool 40 facilitates identification of the surgical tool by the physician.
  • an augmented surgical tool is not displayed.
  • the physician provides an input to the computer processor indicating whether or not she/he would like an augmented surgical tool to be displayed, and the computer processor controls the image that is displayed to the physician based on the input.
  • FIGs. 2C and 2D are schematic illustrations of display 24 showing an icon 42 being positioned such as to overlay augmented surgical tool 40 in order to engage a first control-component tool with the robotic system, in accordance with some applications of the present disclosure.
  • Figs. 2C and 2D show an image of the patient’s eye as well as surgical tool 21 and augmented surgical tool 40
  • the scope of the present disclosure includes displaying an image that includes any representation of the surgical tool, in order to facilitate the engagement of a control-component tool with the robotic system.
  • the display may display only an icon representing the surgical tool (e.g., an augmented image of a surgical tool, e.g., augmented surgical tool 40), or may display an image of the surgical tool itself, or may display an image of the surgical tool and the patient’s eye, or may display an icon representing the surgical tool (e.g., an augmented image of a surgical tool, e.g., augmented surgical tool 40) overlaid on the patient’s eye.
  • an icon representing the surgical tool (e.g., an augmented image of a surgical tool, e.g., augmented surgical tool 40)
  • the processor overlays, on the image, an icon 42 representing the control-component tool.
  • Icon 42 is typically a virtual representation of the control-component tool, i.e., icon 42 is typically a computer-generated graphic that appears similar to the control-component tool. (Hence, icon 42 is also referred to herein as an “augmented control-component tool.”) Alternatively, icon 42 is an image of the control-component tool.
  • the imaging system acquires anterior images of the patient’s eye, since the imaging system is typically disposed above the patient’s eye.
  • the view of the patient’s eye that is displayed by the display is as if the eye is being viewed from a superior position relative to the patient’s head, since this is the view that a physician is accustomed to seeing during ophthalmic surgery.
  • Display 24 typically faces the physician’s face and the orientation and position of icon 42 on display 24 are rotated in accordance with the view of the eye that is shown by the display.
  • the orientation of icon 42 within the frame of reference of the display is substantially similar to the orientation of the control-component tool within the control-component workspace frame of reference.
  • the absolute position of the control-component tool is typically unrelated to the absolute position of the icon.
  • the location and orientation of icon 42 track the location and orientation of the control-component tool, i.e., movements of the control-component tool by the physician generate corresponding movements of the icon on display 24.
  • rotation of the control-component tool through angular rotations generates corresponding rotations of the icon on display 24, and translational movement (along the X, Y, or Z directions) of the control-component tool generates corresponding translational motion of the icon on display 24.
  • the translational motion of the icon on display 24 is scaled up or down relative to the translational motion of the control-component tool.
  • the physician aligns the icon with the surgical tool in the image, i.e., the physician moves the control-component tool such that the icon is aligned with the image of surgical tool 21 and/or augmented surgical tool 40.
  • the icon is considered aligned with the surgical tool when (a) the tip of the icon overlays the image of surgical tool 21 and/or augmented surgical tool 40, and (b) the orientation of the icon is substantially similar to that of the surgical tool.
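The two alignment conditions above can be sketched as a tolerance check on the displayed tip positions and on the orientation difference. This is a hypothetical illustration only; the tolerances, the use of a single in-plane angle, and all names are assumptions, not the disclosed method.

```python
import math

# Hypothetical sketch of the alignment test described above:
# (a) the icon tip overlays the tool tip (within a pixel tolerance), and
# (b) the orientations are substantially similar (within an angular
# tolerance). Tolerances and names are illustrative.

def is_aligned(icon_tip, tool_tip, icon_angle_deg, tool_angle_deg,
               tip_tol_px=15.0, angle_tol_deg=10.0):
    dist = math.hypot(icon_tip[0] - tool_tip[0], icon_tip[1] - tool_tip[1])
    diff = abs(icon_angle_deg - tool_angle_deg) % 360.0
    diff = min(diff, 360.0 - diff)          # wrap to [0, 180]
    return dist <= tip_tol_px and diff <= angle_tol_deg

print(is_aligned((100, 100), (108, 106), 42.0, 47.0))  # True
print(is_aligned((100, 100), (200, 100), 42.0, 47.0))  # False
```

When such a test passes, the processor could treat the control-component tool as aligned and proceed to engage it with the surgical tool.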
  • the processor engages the control-component tool with the surgical tool.
  • the computer processor is configured to position the icon such that when the physician aligns the icon with the surgical tool in the image, the control-component tool itself is disposed relatively centrally within the control-component workspace, and/or the control-component tool can be moved, from this initial position within the control-component workspace, such as to move the surgical tool to any location within the tool workspace, without the control-component tool leaving the control-component workspace.
  • the computer processor drives the robotic unit to adjust the position and/or orientation of the surgical tool such as to complete the alignment.
  • the augmented surgical tool and/or the icon is removed from the image that is displayed on display 24.
  • movements of the control-component tool by the physician generate corresponding movements of the surgical tool.
  • rotation of the control-component tool through angular rotations (roll, pitch, and/or yaw) generates corresponding rotations of the surgical tool.
  • translational movement of the control-component tool (along the X, Y, or Z directions) generates corresponding translational motion of the surgical tool.
  • the translational motion of the surgical tool is scaled up or down relative to the translational motion of the control-component tool.
  • FIGS. 2E-2H are schematic illustrations of generally similar steps to those described with reference to Figs. 2A-D being performed with respect to a second surgical tool 21 that is placed superiorly with respect to a patient’s cornea 38, in accordance with some applications of the present disclosure.
  • the view of the patient’s eye that is displayed on the display is as if the eye is being viewed from a superior position relative to the patient’s head, since this is the view that a physician is accustomed to seeing during ophthalmic surgery.
  • Display 24 typically faces the physician’s face and the orientation and position of the icon on display 24 are rotated in accordance with the view of the eye that is shown by the display.
  • the second surgical tool 21 that is placed superiorly with respect to a patient’s cornea 38 appears at the bottom of the display in the view shown in Fig. 2E.
  • the robotic system is configured to automatically place the second surgical tool in position in the vicinity of the patient’s cornea (e.g., based upon images that are acquired by imaging system 22).
  • the operator (e.g., nurse 25B)
  • the surgical tool is positioned in the vicinity of the patient’s cornea (by the robotic system and/or by the nurse) at a relatively central position within the tool workspace, and/or such that from this initial position within the tool workspace there is no movement of the control-component tool within the control-component workspace that would result in the tool being moved out of the tool workspace.
  • starting from this position typically allows completion of a step of the procedure that is to be performed by the second surgical tool, without the need to reposition the second robotic unit during the step of the procedure.
  • an augmented second surgical tool 44 is overlaid upon the image of the second surgical tool in order to facilitate identification of the surgical tool by the physician, as shown in Fig. 2F.
  • an augmented second surgical tool is not displayed.
  • the physician provides an input to the computer processor indicating whether or not she/he would like an augmented second surgical tool to be displayed, and the computer processor controls the image that is displayed to the physician based on the input.
  • Second icon 46 represents the second control-component tool, by virtue of being a virtual representation or an actual image of the second control-component tool.
  • the second icon is considered aligned with the second surgical tool when (a) the tip of the second icon overlays the image of second surgical tool 21 and/or augmented second surgical tool 44, and (b) the orientation of the second icon is substantially similar to that of the second surgical tool.
  • the processor engages the second control-component tool with the second surgical tool.
  • the computer processor is configured to position the second icon such that when the physician aligns the second icon with the second surgical tool in the image, the second control-component tool itself is disposed relatively centrally within the control-component workspace, and/or the second control-component tool can be moved, from this initial position within the control-component workspace, such as to move the second surgical tool to any location within the tool workspace, without the second control-component tool leaving the control-component workspace.
  • the computer processor drives the second robotic unit to adjust the position and/or orientation of the second surgical tool such as to complete the alignment.
  • the augmented second surgical tool 44 and/or the second icon is removed from the image that is displayed on display 24.
  • movements of the second control-component tool by the physician generate corresponding movements of the second surgical tool.
  • rotation of the second control-component tool through angular rotations (roll, pitch, and/or yaw) generates corresponding rotations of the second surgical tool.
  • translational movement of the second control-component tool (along the X, Y, or Z directions) generates corresponding translational motion of the second surgical tool.
  • the translational motion of the second surgical tool is scaled up or down relative to the translational motion of the second control-component tool.
  • Fig. 2I is a schematic illustration of display 24 showing icon 42 being moved away from augmented surgical tool 40 in order to disengage control-component tool 32 from surgical tool 21, in accordance with some applications of the present disclosure.
  • in order to disengage control-component tool 32 from surgical tool 21, the control-component tool is moved toward an edge of the control-component workspace and/or the control-component tool is moved such that the surgical tool is moved toward the edge of the tool workspace.
  • the computer processor generates an indication of the disengagement before, during, and/or after the disengagement.
  • graphical elements 48 (e.g., stars, crosses, circles, highlights, or other graphical elements)
  • a circle or an ellipse (not shown) is displayed around the iris (or at a different location), and the computer processor is configured to interpret a portion of the tool (such as the tip) exiting the circle as an indication that the operator wishes to disengage the control-component tool from the surgical tool.
  • the size of the circle or ellipse is typically selected such that it is visible within the field of view, but there would typically be no reason to move the portion of the tool outside of the circle for the purpose of the surgery.
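Interpreting the tool tip leaving the displayed circle or ellipse as a disengagement request reduces, in effect, to a point-in-ellipse test. The following is a hypothetical sketch; the center, semi-axes, coordinate units, and names are illustrative assumptions.

```python
# Hypothetical sketch: treat the tool tip exiting a displayed ellipse as an
# indication that the operator wishes to disengage. Values are illustrative.

def tip_outside_ellipse(tip, center, rx, ry):
    """Return True if tip (x, y) lies outside the ellipse with the given
    center and semi-axes rx, ry (standard ellipse inequality)."""
    dx, dy = tip[0] - center[0], tip[1] - center[1]
    return (dx / rx) ** 2 + (dy / ry) ** 2 > 1.0

center = (320.0, 240.0)  # illustrative display coordinates (pixels)
print(tip_outside_ellipse((330.0, 245.0), center, 80.0, 60.0))  # False
print(tip_outside_ellipse((450.0, 240.0), center, 80.0, 60.0))  # True
```

A processor could poll this test on each frame and trigger disengagement once the tip crosses the boundary.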
  • the operator (e.g., physician 25A)
  • the primary purpose of the engagement process is to ensure that before the operator begins to control the surgical tool, the control-component tool has approximately the same orientation as the surgical tool.
  • the control-component tool need not be at any particular location in the control-component workspace.
  • the operator may wish to briefly disengage the control-component tool from the surgical tool, in order to move the control-component tool to a new location within the control-component workspace, e.g., for greater comfort.
  • the system allows the operator to disengage the control-component tool from the surgical tool even without moving the control-component tool as described above, simply by providing another input.
  • in response to the input, the processor disengages the control-component tool from the surgical tool, such that movement of the control-component tool does not cause any movement of the surgical tool. Subsequently, in response to another input from the operator, the processor re-executes the process for engaging the control-component tool with the surgical tool, which is described above with reference to Figs. 2A-H.
  • the operator provides the first input (for disengagement) by pressing an input interface, such as foot pedal 27 or button 33 (Fig. 1A), and provides the second input (for re-engagement) by releasing the input interface.
  • an input interface, such as foot pedal 27 or button 33 (Fig. 1A)
  • the operator continues to press the input interface while moving the control-component tool in its disengaged state.
  • the operator releases the input interface.
  • the control-component tool is re-engaged with the surgical tool after the operator brings the control-component tool, or one of the links to which the control-component tool is coupled, into contact with a surface, such that the surface supports the control-component tool or the link when the control-component tool is re-engaged with the surgical tool.
  • the surface then continues to stabilize the control-component tool while the control-component tool is held by the operator.
  • once the control-component tool becomes engaged with the surgical tool of the robotic unit, it typically does not disengage unless the physician performs a disengagement step (e.g., as described with reference to Fig. 2I or by providing another input, such as by pressing an input interface, as described above).
  • a disengagement step (e.g., as described with reference to Fig. 2I or by providing another input, such as by pressing an input interface, as described above)
  • the control-component tool is disengaged from the surgical tool during a procedure, even without an input from the physician.
  • the control-component tool is disengaged from the surgical tool based on the robotic system detecting an imminent collision of tools with each other or with a portion of the patient’s body.
  • the physician re-engages the control-component tool with the surgical tool by performing the steps described hereinabove.
  • Fig. 3A is a schematic illustration of the robotic system annotated with cuboids that indicate a control-component workspace 60 and a tool workspace 62 for illustrative purposes, in accordance with some applications of the present disclosure.
  • control-component workspace 60 (i.e., the workspace in which the operator can move the control-component tool)
  • tool workspace 62 (i.e., the workspace of the surgical tool)
  • the surgical tool is typically physically constrained by the space within which it is possible for the robotic arm to move the surgical tool.
  • Control-component workspace 60 should be such that the control-component tool has sufficient freedom of movement such as to have the ability to control movement of the surgical tool within the surgical space. If the operator assumes control of the surgical tool (via the control-component tool) when the control-component tool is close to the edge of the control-component workspace, the movement of the control-component tool (and therefore that of the surgical tool) will be limited. Therefore, it is typically preferable for the operator to engage the control-component tool with the surgical tool when the control-component tool is positioned and oriented such that the operator has good freedom of movement of the control-component tool.
  • Tool workspace 62 should ideally cover the space within which the tool is expected to be manipulated for the purpose of the surgery (hereinafter “the surgical space”). If the operator assumes control of the surgical tool (via the control-component tool) when the surgical tool is at the edge of the tool workspace, the movement of the surgical tool will be limited. Therefore, it is typically preferable for the operator to engage the control-component tool with the surgical tool when the surgical tool is positioned and oriented such that it has good freedom of movement.
  • the operator should be able to freely move the surgical tool to all positions and orientations within the surgical space, without the control-component tool reaching the limits of the control-component workspace and without the surgical tool reaching the limits of the tool workspace.
  • the control-component tool becomes engaged with the surgical tool, when the surgical tool and the control-component tool are toward the centers of the tool workspace and the control-component workspace, respectively.
  • control-component workspace has different dimensions from the tool workspace.
  • the tool workspace may be smaller than the control-component workspace, as shown.
  • movements of surgical tool 21 by robotic unit 20 are scaled down relative to movements of control-component tool 32.
  • Fig. 3B is a schematic illustration of robotic system 10 annotated with several frames of reference for illustrative purposes, in accordance with some applications of the present disclosure.
  • the display displays an image that corresponds to “superior” surgery - with the image that is shown to the physician being as if the physician is facing the patient’s face, such that the patient’s chin faces up and forehead faces down.
  • the right robotic unit moves the surgical tool to the left (within right robotic unit frame of reference F5), and the surgical tool is moved to the right on display 24 (within display frame of reference F1).
  • the display displays images captured by the imaging system, with such images being acquired within the imaging system frame of reference F3.
  • movement of the left control-component tool causes the left robotic unit to move the left surgical tool (within left robotic unit frame of reference F4).
  • control-component unit 26 is physically attached to the same body as display 24, such that there is a rigid and constant transformation from the control-component frame of reference F2 to the display frame of reference F1.
  • the imaging system frame of reference F3 can be moved with respect to the patient. For example, the imaging system can be rotated such that it is as if the physician is viewing the eye with the superior and lateral (i.e., temporal) directions being reversed.
  • movements of the control-component tools within frame of reference F2 generate movement of the surgical tools in the same direction within display frame of reference F1.
  • the computer processor typically receives inputs that are indicative of the orientations of the various frames of reference relative to each other.
  • the computer processor analyzes images of the robotic units within images acquired by the imaging system in order to determine the orientations of the various frames of reference relative to each other.
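Converting motion commands between the frames of reference described above amounts to applying known rotations between frames. The sketch below assumes, purely for illustration, that the relative orientation of two frames is a rotation about the vertical axis; the angle and all names are hypothetical, and a real system would use a full rigid-body transformation.

```python
import math

# Hypothetical sketch: map a displacement expressed in the control-component
# frame of reference (F2) into a robotic-unit frame, assuming the relative
# orientation is a known rotation about the Z axis. Illustrative only.

def rotate_z(vec, angle_deg):
    """Rotate an (x, y, z) vector about the Z axis by angle_deg."""
    a = math.radians(angle_deg)
    x, y, z = vec
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

# A rightward motion in F2 maps to a different direction in a robotic-unit
# frame that is rotated 90 degrees relative to F2.
print(rotate_z((1.0, 0.0, 0.0), 90.0))
```

Chaining such rotations (control-component frame to display frame to robotic-unit frame) is one way the processor could keep operator motions and tool motions consistent across frames.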
  • the computer processor receives an input from the operator indicating the orientation of surgery that she/he would like to be displayed on the display.
  • the right control component controls movement of the surgical tool that is toward the right of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s right hand)
  • the left control component controls movement of the surgical tool that is toward the left of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s left hand).
  • the right control component controls movement of the surgical tool that is toward the left of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s left hand)
  • the left control component controls movement of the surgical tool that is toward the right of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s right hand).
  • the physician may switch which control component controls which tool.
  • the computer processor or the physician may determine that it is easier for the surgical tools to perform their designated functions while staying within their ranges of motion by using the left control-component tool to control the right surgical tool and vice versa, and the computer processor may drive the robotic units to function accordingly (based upon the automatic detection by the computer processor or based upon an input from the physician).
  • the computer processor converts inputs from the physician regarding movements and actions that are provided within the right control-component tool frame of reference to corresponding movements and actions of the left surgical tool by the left robotic unit within the left robotic unit frame of reference, and/or vice versa.
  • the computer processor or the physician determines that it will be easier for one of the surgical tools to perform its designated function by being inserted into the eye from an inferior position (i.e., by being inserted from above the patient’s cheek). For some such applications, based upon the automatic detection by the computer processor or based upon an input from the physician indicating that this is the case, the computer processor drives a selected one of the robotic units to insert the surgical tool from the inferior position.
  • the computer processor receives inputs from the physician regarding movements and actions to perform using the inferiorly-inserted surgical tool, based on the operator controlling the control-component tool of the corresponding control component while the control-component is disposed at its regular orientation.
  • the computer processor converts inputs from the physician regarding movements and actions that are provided within either the right or left control-component tool frame of reference to corresponding movements and actions of the inferiorly-inserted surgical tool by the selected robotic unit within the frame of reference of the selected robotic unit.
  • the robotic surgery is performed by forming one incision in the patient’s cornea at a circumferential position that is not a standard position for incisions that are made during typical cataract surgery.
  • an incision may be made at a non-standard circumferential position around the patient’s cornea, in order to correct an astigmatism of the patient, at the same time as providing an insertion point (or region) for the surgical tool(s).
  • Such incisions may include limbal relaxing incisions and/or a clear corneal incision, both of which are techniques that are known in the art and are typically customized to the corneal topography of the particular patient.
  • a clear corneal incision is typically aligned with the steep axis of the cornea.
  • the tools are inserted and operated from non-standard positions.
  • the control-component units typically remain in their regular positions with respect to the imaging system and/or other components of the system (e.g., with respect to each other).
  • the computer processor drives a selected one of the robotic units to insert its surgical tool(s) from the non-standard position.
  • the computer processor receives inputs from the physician regarding movements and actions to perform using the non-standardly-inserted surgical tool, based on the operator controlling the control-component tool of a selected one of the control components while the control-component is disposed at its regular position.
  • the computer processor converts inputs from the physician regarding movements and actions that are provided within the selected control-component tool frame of reference to corresponding movements and actions of the non-standardly-inserted surgical tool by the selected robotic unit within the frame of reference of the selected robotic unit.
  • FIGS. 4A, 4B, 4C, and 4D are schematic illustrations of control component 30, in accordance with some applications of the present disclosure.
  • Figs. 4A and 4B show respective oblique views of the control component
  • Fig. 4C shows a side view
  • Fig. 4D shows a top view.
  • Control component 30 includes control-component tool 32 and multiple links 54 to which the control-component tool is coupled.
  • Links 54 are coupled to each other via one or more joints 47, such that the links provide multiple degrees of freedom to the control-component tool.
  • links 54 provide six degrees of freedom, including three translational degrees of freedom and three rotational degrees of freedom, to the control-component tool.
  • links 54 include a frame 50, which in some embodiments includes two curved arms. Frame 50 is configured to rotate around a first rotational axis 52X. Links 54 further include a shaft 53, to which control-component tool 32 is mounted, and one or more supporting links 55, e.g., a pair of parallel supporting links 55, which couple shaft 53 to frame 50 and rotate around a second rotational axis 52Y and around a third rotational axis 52Z. As the operator moves the control-component tool within the X- Y plane, frame 50 rotates about rotational axis 52X and supporting links 55 rotate about rotational axis 52Y. As the operator moves the control-component tool along the Z linear direction, supporting links 55 rotate about rotational axis 52Z.
  • control-component tool 32 is moveable by the operator to undergo pitch, yaw, and roll angular rotations.
  • the control-component tool undergoes pitch angular rotation by rotating about a pitch rotational axis 70 at the joint 47 between the control-component tool and shaft 53, yaw angular rotation by shaft 53 rotating about its own axis 72 (which functions as the yaw rotational axis), and roll angular rotation by rotating about its own longitudinal axis 74 (which functions as the roll rotational axis).
  • control component 30 includes at least three rotary encoders, which are disposed at different respective joints 47.
  • the rotary encoders detect the rotation of the links about the rotational axes, and generate signals in response thereto.
  • an inertial measurement unit 76 is housed within the control-component tool.
  • the inertial measurement unit includes a three-axis accelerometer, a three-axis gyroscope, and/or a three-axis magnetometer.
  • the inertial measurement unit generates an inertial-measurement-unit signal describing the three-dimensional orientation of the control-component tool.
  • control component includes one or more additional rotary encoders to detect the roll, pitch and/or yaw orientation of control-component tool 32.
  • Computer processor 28 (Fig. 1A) receives the rotary-encoder signals and the inertial-measurement-unit signal, and computes the XYZ location and orientation of the tip 58 of control-component tool 32 based on these signals.
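As an illustrative sketch only (the actual implementation is not disclosed), the pose computation might combine the rotary-encoder angles with the IMU orientation roughly as follows. The two-link serial chain, the link lengths, and the axis assignments are all hypothetical simplifications:

```python
import numpy as np

# Hypothetical link lengths (metres); the real geometry of links 54 is not given.
L_FRAME = 0.20    # frame 50, rotating about axis 52X
L_SUPPORT = 0.15  # supporting links 55, rotating about axes 52Y/52Z

def rot(axis, theta):
    """3x3 rotation matrix about a principal axis ('x', 'y', or 'z')."""
    c, s = np.cos(theta), np.sin(theta)
    if axis == 'x':
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == 'y':
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def tip_pose(theta_x, theta_y, theta_z, imu_orientation):
    """Estimate the XYZ location and orientation of tip 58.

    theta_x, theta_y, theta_z: rotary-encoder angles at the joints about
    axes 52X, 52Y, and 52Z; imu_orientation: 3x3 rotation matrix derived
    from the signal of inertial measurement unit 76.
    """
    # Location from a simplified serial chain of two links.
    location = rot('x', theta_x) @ (
        np.array([0.0, L_FRAME, 0.0])
        + rot('y', theta_y) @ rot('z', theta_z) @ np.array([0.0, L_SUPPORT, 0.0])
    )
    # Orientation taken directly from the inertial measurement unit.
    return location, imu_orientation
```

At zero joint angles, the tip location reduces to the sum of the (assumed) link lengths along one axis, which is a quick sanity check on the chain.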
  • the operator may wish to stabilize the control-component tool such that a greater amount of force is required to move the control-component tool, relative to the amount of force that is usually required. For example, while one hand of the operator moves one of the control-component tools, it may be difficult for the operator to avoid accidentally moving the other control-component tool, which is held by the other hand of the operator. Alternatively or additionally, it may be difficult for the operator to avoid accidentally moving a control-component tool held by the operator's non-dominant hand.
  • some embodiments allow the operator to provide an input, e.g., by pressing an input interface, such as foot pedal 27 or button 33 (Fig. 1A).
  • the processor increases the resistance of the control-component tool to forces applied to the control-component tool by the operator, without disallowing movement of the control-component tool.
  • the control-component tool is stabilized against accidental movement, yet by applying enough force, the operator can still move the control-component tool if necessary.
  • respective motors are operatively coupled to joints 47, and the processor increases the resistance to the forces applied by the operator by increasing the (translational and/or rotational) counterforce applied to the joints by the motors.
  • the motors are direct-drive motors (i.e., motors that do not impart motion via gear wheels).
  • the motors are linear motors, e.g., linear voice coil motors.
  • FIGs. 4A-D show control component 30 including motors 56X, 56Y, and 56Z.
  • Motor 56X (and, optionally, an extension 56XE thereof) is coupled to an angled extension 50E of frame 50.
  • Motor 56Y (and, optionally, an extension 56YE thereof) passes between the two curved arms of frame 50, such that motor 56Y (and, optionally, extension 56YE) is coupled to an angled extension 55E of a supporting link 55.
  • Motor 56Z is disposed at the end of supporting links 55 that is opposite from shaft 53.
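One plausible software-side model of the stabilization described above, sketched here purely for illustration, is velocity-proportional damping whose gain increases while the operator's stabilization input is active. The gain values and the simple damping model are invented for the example:

```python
def joint_counterforces(joint_velocities, stabilize,
                        base_damping=0.05, stabilized_damping=0.5):
    """Per-joint counterforces opposing operator motion.

    When the operator presses the stabilization input (e.g., foot pedal 27
    or button 33), the damping gain is raised so that more force is needed
    to move the control-component tool, without disallowing movement.
    """
    gain = stabilized_damping if stabilize else base_damping
    return [-gain * v for v in joint_velocities]
```

Because the counterforce scales with velocity rather than saturating into a hard stop, the operator can still move the tool by applying enough force, matching the behavior described above.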
  • the operator may wish to limit (e.g., disallow) at least one type of movement of the surgical tool, without limiting other types of movement of the surgical tool.
  • the operator may wish to inhibit the surgical tool from moving deeper into the eye (or, for other types of surgery, any other portion of the body) or from moving into the edge of the corneal incision (or, for other types of surgery, any other incision in the body), but still allow the surgical tool to translate in other directions and/or to rotate.
  • some embodiments allow the operator to provide an input, e.g., by pressing an input interface, such as foot pedal 27 or button 33 (Fig. 1A).
  • the processor is configured to receive the input, and to interpret the input as indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited.
  • the processor limits (e.g., disallows) the movement of the control-component tool with respect to the type of movement but not with respect to the other types of movement.
  • the processor limits the movement of the control-component tool with respect to at least one of the multiple (e.g., six) degrees of freedom of the control-component tool.
  • the processor limits all translational movement (i.e., limits the movement with respect to all three translational degrees of freedom) without limiting any rotational movement of the control-component tool.
  • the point along a surgical tool via which a tool is inserted into the eye via a corneal incision functions as a virtual pivot point, in that rotational movements of the corresponding control tool cause the surgical tool to pivot with respect to this point (in order to prevent tearing of the corneal incision), as described in co-assigned US Patent Application Publication 2023/0240779 to Golan, whose disclosure is incorporated herein by reference.
  • the processor limits the movement of the control-component tool with respect to at least one of the multiple (e.g., six) degrees of freedom of the control-component tool in order to help the operator maintain the pivot point about which the surgical tool rotates.
  • the processor limits translational movement of the control-component tool in at least one direction without limiting any rotational movement of the control-component tool. For example, as described immediately above, in some embodiments, the processor limits all translational movement. Alternatively, for example, the processor limits translational movement along only one or two of the X-, Y-, and Z-axes. Alternatively or additionally, the processor limits translational movement in only one direction along any of the X-, Y-, and Z-axes.
  • the limited direction is defined with respect to the orientation of the control-component tool.
  • the processor limits translational movement of the control-component tool along longitudinal axis 74 of the control-component tool (at least in the direction of tip 58) or perpendicular to axis 74. This can help the operator, for example, to avoid moving the surgical tool deeper into the eye or into the edge of the corneal incision.
  • the processor limits the movement of the control-component tool by increasing the (translational and/or rotational) counterforce applied to the rotational joints by the motors, as described above.
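For illustration only, a directional limit of the kind described above can be expressed as removing the forbidden component of each commanded translation before it is applied. The disclosure itself implements the limit via motor counterforces, so this projection is a simplified stand-in:

```python
import numpy as np

def constrain_translation(delta, limited_direction):
    """Remove the component of a commanded translation along a limited
    direction (e.g., along longitudinal axis 74 toward tip 58), while
    leaving translation in other directions, and all rotation, unaffected.
    """
    d = np.asarray(limited_direction, dtype=float)
    d = d / np.linalg.norm(d)
    along = float(np.dot(delta, d))
    if along > 0:  # block motion toward the limited direction only
        delta = delta - along * d
    return delta
```

Translations directed away from the limited direction pass through unchanged, so the operator retains control in all other directions.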
  • the processor is configured to engage control-component tool 32 with the surgical tool, using the method described above with reference to Figs. 2A-H or any other suitable method. Following the engagement, movement of the control-component tool by the operator causes corresponding movement of the surgical tool by the robotic unit. As described above in the context of Fig. 2I, in some embodiments, the processor is further configured to disengage control-component tool 32 from the surgical tool in response to a first input from the operator, such that movement of the control-component tool does not cause any movement of the surgical tool. Following the disengagement, in some embodiments, the operator brings the control-component tool or one of links 54 into contact with a surface, such as a horizontal surface, such that the surface supports the control-component tool or the link. For example, in some embodiments, the operator brings shaft 53 into contact with the surface 78 to which links 54 are mounted.
  • the processor re-engages the control-component tool with the surgical tool, using the method described above with reference to Figs. 2A-H or any other suitable method.
  • the surface then stabilizes the control-component tool while the control-component tool is held by the operator.
  • This technique for stabilizing the control-component tool can be used alternatively or additionally to increasing the resistance of the control-component tool to forces applied to the control-component tool by the operator.
  • some embodiments allow manipulation of the control-component tool following the re-engagement, i.e., the processor drives the robotic unit to move the surgical tool correspondingly to movement of the control-component tool while the surface supports the control-component tool or the link.
  • the movement of the control-component tool is not constrained by the surface even while the surface supports the control-component tool or the link, such that the operator retains full control over the surgical tool.
  • the element of control component 30 that is supported by the surface includes a compression spring 79 and a retractable portion 80 coupled to compression spring 79.
  • the surface supports the element at retractable portion 80, such that the retractable portion retracts and spring 79 compresses, as indicated by a retraction indicator 81, when the element is pushed against the surface, as indicated by a pushing indicator 82.
  • retractable portion 80 is positioned at the bottom end of shaft 53.
  • retractable portion 80 is positioned at tip 58 of the control-component tool.
  • tip 58 is supported by a surface 86 or 87 of workstation 84 (Fig. 5A).
  • control component 30 including compression spring 79
  • the surface is mounted to a compression spring such that the surface retracts, and the spring compresses, when the element of control component 30 is pushed against the surface.
  • FIG. 5A and 5B are schematic illustrations of workstation 84, in accordance with some embodiments of the present disclosure.
  • Fig. 5B shows the workstation with surface 87 removed, such that links 54 and surface 78, upon which the links are mounted, are visible.
  • surface 87 typically covers surface 78 and links 54, while shafts 53 support respective control-component tools 32 such that the control-component tools hover above surfaces 86 and 87.
  • workstation 84 typically includes control components 30 and at least one display 24. The operator sits (or stands) behind the control components, such that the control components are between the operator and the display, while facing the display.
  • workstation 84 further includes a respective docking station 88 for each control-component tool 32. When the control component is not in use, the operator docks the control-component tool in docking station 88.
  • the lower enlarged frame in Fig. 5A shows one of the control-component tools in its docked position.
  • workstation 84 includes a surface 86 positioned behind the control components. The operator sits (or stands) behind surface 86 and reaches over surface 86 to hold the control-component tools, such that surface 86 supports the operator’s hands.
  • inertial measurement unit 76 typically includes a three-axis accelerometer, a three-axis gyroscope, and/or a three-axis magnetometer.
  • the accelerometers directly measure acceleration
  • the gyroscopes directly measure angular velocity
  • the magnetometers measure magnetic fields.
  • the combination of the above measurements is used to infer the orientation of the inertial-measurement unit.
  • algorithms generally use the earth's gravitational pull as a known acceleration, which is fused with the gyroscope measurements integrated over time to infer the inertial measurement unit's orientation. Without continuous correction of the orientation by the gravitational acceleration vector, the orientation output for the inertial measurement unit tends to drift. This is because angular information is derived from the gyroscopes through numerical integration. Any error in the angular velocity measurement accumulates over time, eventually resulting in a very poor estimate of the true orientation. The gravitational acceleration creates a "ground truth" that is used to remove the drift. If the gravitational vector has not changed orientation, a change in angle derived from the gyroscopes can be ignored as drift.
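The gyroscope-plus-gravity fusion described above is commonly implemented as a complementary filter. The one-axis sketch below assumes a standard textbook formulation with an invented gain, not the system's actual filter:

```python
import numpy as np

def complementary_tilt(prev_angle, gyro_rate, accel, dt, alpha=0.98):
    """One-axis complementary filter.

    The gyroscope rate is integrated (which drifts over time), and the
    tilt angle implied by the measured gravity vector serves as the
    "ground truth" that continuously removes the drift. alpha close to 1
    trusts the gyroscope in the short term.
    """
    gyro_angle = prev_angle + gyro_rate * dt       # numerical integration
    accel_angle = np.arctan2(accel[1], accel[2])   # tilt implied by gravity
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```

Each update pulls the integrated gyroscope angle slightly toward the gravity-implied angle, so accumulated integration error cannot grow without bound.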
  • a “ground truth” vector is derived that is linearly independent from the gravitational vector.
  • the control-component tools are docked in a given predetermined orientation, which is typically not vertical.
  • a sensor 92 such as a switch, a photo-reflector, etc. identifies when the control-component tool is docked.
  • the computer processor recalibrates the inertial measurement unit, based on two ground truth vectors - the gravitational vector and the known orientation of the control-component tool.
  • the inertial measurement unit is recalibrated such that any drift is corrected.
  • the computer processor performs the recalibration of the inertial measurement unit using the following algorithm.
  • the inertial measurement unit transmits the roll axis position that it is detecting to the computer processor.
  • the roll axis is the axis that typically drifts due to lack of gravitational information.
  • the roll axis position is cast onto the horizontal plane (normal to gravity), and compared to the known, true tilt angle; that is, the angle at which the control-component tool is known to lie within the horizontal plane. Whatever difference there is between the measured orientation and the true orientation is subtracted from the measured orientation.
  • the inertial measurement unit's measurement is corrected to fit the true orientation the control-component tool is known to be in, and the correction is maintained until the next time the control-component tool is docked, at which point the correction is repeated.
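A minimal sketch of the docking recalibration described above, under the assumption of a world frame with Z pointing up; the vector handling is illustrative, not the disclosed implementation:

```python
import numpy as np

def roll_drift_correction(measured_roll_axis, true_dock_angle):
    """Drift correction computed when the control-component tool is docked.

    The measured roll axis (a unit vector in a world frame with Z up) is
    cast onto the horizontal plane by dropping its vertical component; the
    angle it makes there is compared with the known, true tilt angle at
    which the docked tool lies, and the difference is the drift to subtract
    from subsequent measurements.
    """
    x, y, _z = measured_roll_axis
    measured_angle = np.arctan2(y, x)
    return measured_angle - true_dock_angle
```

Subtracting the returned value from later roll readings re-anchors the measurement to the known docked orientation until the next docking event.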
  • FIG. 6 schematically illustrates use of a tool-type- specific transformation, in accordance with some embodiments of the present disclosure.
  • Fig. 6 shows a straight surgical tool 21a and a non-straight surgical tool 21b.
  • in straight surgical tool 21a, the functional portion of the tool, which includes tip 94a, is generally parallel to the handle of the tool.
  • in non-straight surgical tool 21b, the functional portion of the tool, which includes tip 94b, is not parallel to the handle of the tool.
  • the processor is configured to drive the robotic unit to move the surgical tool held by the robotic unit correspondingly to movement of control-component tool 32 by an operator.
  • the processor continually applies a transformation, such as an affine transformation, to the coordinates of the control-component tool, to compute corresponding coordinates of the surgical tool.
  • the processor then drives the robotic unit to move the surgical tool to the corresponding coordinates.
  • the transformed coordinates are those of tip 58, and the corresponding coordinates are those of the tip of the surgical tool.
  • the transformed coordinates are those of another portion of the control-component tool, and the corresponding coordinates are those of another, corresponding portion of the surgical tool.
  • surgical tools of different types are preferably held at different orientations, e.g., at different pitches, during a surgical procedure.
  • non-straight surgical tools are preferably held at a greater pitch than straight tools, i.e., the angle of the handle of the tool with respect to the horizontal is preferably greater for non-straight surgical tools, relative to straight surgical tools.
  • Fig. 6 shows surgical tool 21a held at a pitch of θ1, and surgical tool 21b held at a pitch of θ2 > θ1.
  • the operator typically holds control-component tool 32 at the preferred orientation of the surgical tool controlled by the control-component tool.
  • Fig. 6 shows the control-component tool held at pitch θ1 when controlling tool 21a, and at pitch θ2 when controlling tool 21b.
  • the operator changes the orientation (e.g., the pitch) of the control-component tool.
  • this change in orientation typically causes the position of tip 58 (or alternatively, another portion of the control-component tool whose coordinates are transformed) to change.
  • the corresponding coordinates of the surgical tool would undesirably change. For example, when switching from straight surgical tool 21a to non-straight surgical tool 21b, the height of tip 58 decreases, such that the corresponding height of tip 94b would also decrease. Conversely, when switching from non-straight surgical tool 21b to straight surgical tool 21a, the height of tip 58 increases, such that the corresponding height of tip 94a would also increase.
  • the processor is configured to use different transformations for different tool types.
  • the processor is configured to identify, automatically or in response to a manual input, the type of surgical tool held by the robotic unit.
  • the processor selects a transformation from multiple predefined transformations, which may be affine or may be of any other suitable type.
  • the processor drives the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by the operator, by applying the selected transformation to the coordinates of the control-component tool to compute the corresponding coordinates of the surgical tool.
  • the predefined transformations are configured to minimize the change in the corresponding coordinates of the surgical tool, by mapping different respective coordinates of the control-component tool to the same coordinates of the surgical tool.
  • the predefined transformations include respective translations that vary from each other, e.g., along the vertical axis.
  • the processor applies an affine transformation Az + b to the coordinate vector z of the control-component tool, where A is an invertible matrix and b is a translation vector.
  • while A is the same for all of the predefined transformations, b is unique for each predefined transformation.
  • Fig. 6 shows the processor using a first transformation T1 for straight surgical tool 21a, but a different, second transformation T2 for non-straight surgical tool 21b.
  • First transformation T1 has a greater vertical translation, relative to second transformation T2.
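To make the per-tool-type mapping concrete, here is an illustrative sketch of the affine transformation Az + b with a shared matrix A and tool-type-specific translation vectors b. The scaling factor and the offset values are invented numbers, not taken from the disclosure:

```python
import numpy as np

# Shared invertible matrix A (here, uniform motion scaling from the
# control-component tool to the surgical tool) -- an assumed value.
A = np.eye(3) * 0.1

# Tool-type-specific translation vectors b; T1 (straight tools) has a
# greater vertical translation than T2 (non-straight tools).
B = {
    "straight": np.array([0.0, 0.0, 0.030]),      # transformation T1
    "non_straight": np.array([0.0, 0.0, 0.012]),  # transformation T2
}

def surgical_coords(tool_type, control_coords):
    """Apply the tool-type-specific affine transformation Az + b to the
    coordinate vector z of the control-component tool."""
    return A @ np.asarray(control_coords, dtype=float) + B[tool_type]
```

Only b changes between tool types, so the mapping's scaling behavior stays identical while the vertical offset compensates for the different preferred pitches.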
  • the predefined transformations correspond to different respective orientations, e.g., pitches, at which surgical tools are held during the surgical procedure.
  • first transformation T1 corresponds to pitch θ1
  • second transformation T2 corresponds to pitch θ2.
  • the processor selects a different one of the predefined transformations in response to the surgical tool being of a non-straight type, relative to the surgical tool being of a straight type, due to the different preferred orientations for these types.
  • non-straight types include an ophthalmic chopper, and forceps.
  • non-straight tool types include a keratome blade, a paracentesis knife, and/or a syringe (e.g., a dispersive ophthalmic viscosurgical device (OVD) syringe, a cohesive ophthalmic viscosurgical device (OVD) syringe, a staining syringe, a lidocaine syringe, a hydrodissection syringe, and/or an antibiotics syringe).
  • straight types include an intraocular injector and a phaco
  • FIG. 7 shows a flow diagram 96 for an algorithm executed by processor 28 (Figs. 1A-B) to control the surgical tools using tool-type-specific transformations, in accordance with some embodiments of the present disclosure.
  • the processor moves a first surgical tool using a first predefined transformation, i.e., the processor drives the robotic unit to move the first surgical tool correspondingly to the movement of the control-component tool, by applying a first transformation to the coordinates of the control-component tool to compute the corresponding coordinates of the first surgical tool.
  • the processor disengages the control-component tool from the first surgical tool, e.g., as described above with reference to Fig. 2I, at a disengaging step 100, such that the movement of the control-component tool does not cause any movement of the first surgical tool.
  • a second surgical tool replaces the first surgical tool while the control-component tool is disengaged from the first surgical tool.
  • the processor identifies the type of the second surgical tool, at a type-identifying step 102, in response to the second surgical tool replacing the first surgical tool. (As noted above, this identification can be automatic or in response to a manual input.)
  • the processor at a transformation-selecting step 104, selects a second predefined transformation in response to the type of the second surgical tool.
  • the processor engages the control-component tool to the second surgical tool, e.g., as described above with reference to Figs. 2A-H, at an engaging step 106.
  • the processor moves the second surgical tool using the second predefined transformation, i.e., the processor drives the robotic unit to move the second surgical tool correspondingly to the movement of the control-component tool, by applying the second transformation to the coordinates of the control-component tool to compute the corresponding coordinates of the second surgical tool.
  • the application of the second transformation reduces the required movement of the control-component tool following the second surgical tool replacing the first surgical tool, relative to if the first transformation were applied following the second surgical tool replacing the first surgical tool.
  • the first surgical tool is straight and the second surgical tool is non-straight, or vice versa. If the same transformation were applied for both tools, the control-component tool might need to be moved a significant vertical distance following the replacement, as described above with reference to Fig. 6. On the other hand, by virtue of using different transformations, this required distance is reduced.
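The steps of flow diagram 96 can be sketched as the following sequence. The processor object and its method names are hypothetical stand-ins for the behavior described, not an actual API:

```python
def handle_tool_replacement(processor, new_tool):
    """Sketch of flow diagram 96 for a surgical-tool replacement."""
    processor.disengage()                                   # disengaging step 100
    tool_type = processor.identify_type(new_tool)           # type-identifying step 102
    processor.transform = processor.predefined[tool_type]   # transformation-selecting step 104
    processor.engage(new_tool)                              # engaging step 106
```

Selecting the transformation before re-engaging ensures that the first post-engagement motion of the control-component tool already maps through the new tool's transformation.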
  • the scope of the present application includes applying the apparatus and methods described herein to other medical procedures, mutatis mutandis.
  • the apparatus and methods described herein may be applied to other microsurgical procedures, such as general surgery, orthopedic surgery, gynecological surgery, otolaryngology, neurosurgery, oral and maxillofacial surgery, plastic surgery, podiatric surgery, vascular surgery, and/or pediatric surgery that is performed using microsurgical techniques.
  • the imaging system includes one or more microscopic imaging units.
  • Such procedures may include collagen crosslinking, endothelial keratoplasty (e.g., DSEK, DMEK, and/or PDEK), DSO (descemet stripping without transplantation), laser assisted keratoplasty, keratoplasty, LASIK/PRK, SMILE, pterygium, ocular surface cancer treatment, secondary IOL placement (sutured, transconjunctival, etc.), iris repair, IOL reposition, IOL exchange, superficial keratectomy, Minimally Invasive Glaucoma Surgery (MIGS), limbal stem cell transplantation, astigmatic keratotomy, Limbal Relaxing Incisions (LRI), amniotic membrane transplantation (AMT), glaucoma surgery (e.g., trabs, tubes, minimally invasive glaucoma surgery), automated lamella
  • a computer-usable or computer-readable medium can be any apparatus that can include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
  • the computer-usable or computer readable medium is a non- transitory computer-usable or computer readable medium.
  • Examples of a computer-readable medium include a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random-access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
  • Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W), DVD, and a USB drive.
  • a data processing system suitable for storing and/or executing program code will include at least one processor (e.g., computer processor 28) coupled directly or indirectly to memory elements through a system bus.
  • the memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
  • the system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments of the disclosure.
  • Network adapters may be coupled to the processor to enable the processor to become coupled to other processors or remote printers or storage devices through intervening private or public networks.
  • Modems, cable modems, and Ethernet cards are just a few of the currently available types of network adapters.
  • Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the C programming language or similar programming languages.
  • These computer program instructions may also be stored in a computer-readable medium (e.g., a non-transitory computer-readable medium) that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the algorithms.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the algorithms described in the present application.
  • Computer processor 28 is typically a hardware device programmed with computer program instructions to produce a special purpose computer. For example, when programmed to perform the algorithms described with reference to the Figures, computer processor 28 typically acts as a special purpose robotic-system computer processor. Typically, the operations described herein that are performed by computer processor 28 transform the physical state of a memory, which is a real physical article, to have a different magnetic polarity, electrical charge, or the like depending on the technology of the memory that is used. For some applications, operations that are described as being performed by a computer processor are performed by a plurality of computer processors in combination with each other.


Abstract

Apparatus and methods are described for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit (20) holding a surgical tool (21). A processor (28) executes a process for engaging a control-component tool (32) with the surgical tool (21) such that movement of the control-component tool (32) by an operator causes corresponding movement of the surgical tool (21) by the robotic unit (20). In response to a first input from the operator, the processor (28) disengages the control-component tool (32) from the surgical tool (21), such that the movement of the control-component tool (32) does not cause any movement of the surgical tool (21), and in response to a second input from the operator, the processor re-executes the process for engaging the control-component tool (32) with the surgical tool (21). Other applications are also described.

Description

CONTROLLING A SURGICAL TOOL FOR PERFORMING MICROSURGICAL PROCEDURES IN A ROBOTIC MANNER
CROSS-REFERENCES TO RELATED APPLICATIONS
The present application claims priority from U.S. Provisional Patent Application No. 63/568,216 to Nathan, filed March 21, 2024, entitled "Engagement of microsurgical robotic system," and U.S. Provisional Patent Application No. 63/698,118 to Levinson, filed September 24, 2024, entitled "Controlling a surgical tool," the disclosures of which are incorporated herein by reference.
FIELD OF EMBODIMENTS
Some applications of the present disclosure generally relate to medical apparatus and methods. Specifically, some applications of the present disclosure relate to apparatus and methods for performing microsurgical procedures in a robotic manner.
BACKGROUND
Robotic surgery has significantly advanced the field of surgery by providing enhanced precision, control, and minimally invasive approaches to complex medical procedures. Over the past few decades, robotic-assisted systems have evolved, with a strong focus on improving the accuracy, safety, and efficacy of surgeries across a wide range of medical specialties. However, despite the tremendous advancements in robotic surgery, certain specialized surgical fields, particularly microsurgery, still face challenges that limit the full potential of these systems.
Microsurgery involves performing highly precise surgical procedures on extremely small structures, such as blood vessels, nerves, or tissues, that are typically less than a few millimeters in size. These delicate operations require a combination of exceptional fine motor skills, stable hands, and superior depth perception, all while maintaining a clear and magnified view of the surgical site. In the past, microsurgical procedures were primarily conducted by human surgeons using traditional hand-held instruments, relying heavily on the skill and experience of the operator. Despite the best efforts of surgeons, inherent limitations such as tremors, fatigue, and human error often result in compromised outcomes.
Recent advancements in robotic systems have introduced various forms of assistance to overcome these challenges, including improved visualization, remote control, and the use of advanced actuators for fine motor movements. However, the adoption of robotic systems in microsurgery is still in its nascent stages due to the unique demands of microsurgical procedures. Specifically, the difficulty in maintaining constant, precise instrument control over extended periods and the ability to perform multiple movements with extreme dexterity remain significant barriers to the successful application of robotic systems in microsurgery.
Cataract surgery involves the removal of the natural lens of the eye that has developed an opacification (known as a cataract), and its replacement with an intraocular lens. Such surgery typically involves a number of standard steps, which are performed sequentially.
In an initial step, the patient's face around the eye is disinfected (typically, with iodine solution), and the face is covered by a sterile drape, such that only the eye is exposed. When the disinfection and draping have been completed, the eye is anesthetized, typically using a local anesthetic, which is administered in the form of liquid eye drops. The eyeball is then exposed, using an eyelid speculum that holds the upper and lower eyelids open. One or more (e.g., 2-3) incisions, typically including at least one larger incision having a three-planar form, are made in the cornea of the eye. The incisions are typically made using a specialized blade, which is called a keratome blade. Subsequently, another anesthetic, such as lidocaine, is injected into the anterior chamber of the eye via the corneal incisions. Following this step, the pupil is dilated, and a viscoelastic injection is applied via the corneal incisions. The viscoelastic injection is performed in order to stabilize the anterior chamber and to help maintain eye pressure during the remainder of the procedure, and also in order to distend the lens capsule.
In a subsequent stage, known as capsulorhexis, a part of the anterior lens capsule is removed, using one or more tools inserted via the corneal incisions. Various enhanced techniques have been developed for performing capsulorhexis, such as laser-assisted capsulorhexis, zepto-rhexis (which utilizes precision nano-pulse technology), and marker-assisted capsulorhexis (in which the cornea is marked using a predefined marker, in order to indicate the desired size for the capsule opening).
Subsequently, it is common for a fluid wave to be injected via the corneal incisions, in order to dissect the cataract's outer cortical layer, in a step known as hydrodissection. In a subsequent step, known as hydrodelineation, the outer softer epi-nucleus of the lens is separated from the inner firmer endo-nucleus by the injection of a fluid wave. In the next step, ultrasonic emulsification of the lens is performed, in a process known as phacoemulsification. The nucleus of the lens is broken initially using a chopper, following which the outer fragments of the lens are broken and removed, typically using an ultrasonic phacoemulsification probe. When the phacoemulsification is complete, the remaining lens cortex (i.e., the outer layer of the lens) and viscoelastic material are aspirated from the capsule. During the phacoemulsification and the aspiration, aspirated fluids are typically replaced with irrigation of a balanced salt solution, in order to maintain fluid pressure in the anterior chamber.
In some cases, if deemed to be necessary, the capsule is polished. Subsequently, the intraocular lens (IOL) is inserted into the capsule. The IOL is typically foldable and is inserted in a folded configuration, before unfolding inside the capsule. If necessary, one or more of the incisions are sealed by elevating the pressure inside the bulbus oculi (i.e., the globe of the eye), causing the internal tissue to be pressed against the external tissue of the incisions, such as to force closed the incisions.
SUMMARY
In accordance with some applications of the present disclosure, a robotic system is provided for performing a robotic surgical procedure (e.g., a microsurgical procedure, such as an intraocular surgical procedure) on a portion of a body of a subject. Typically, the robotic system includes one or more robotic units configured to hold surgical tools, in addition to an imaging system, one or more displays, and a control-component unit, via which one or more operators (e.g., healthcare professionals, such as a physician and/or a nurse) control the robotic units. Typically, the robotic system includes at least one computer processor, via which components of the system and operator(s) operatively interact with each other.
Typically, the control-component unit includes one or more control components that are configured to correspond to respective robotic units of the robotic system. For example, the system may include first and second robotic units, and the control-component unit may include first and second control components. Typically, each of the control components includes an arm that includes a plurality of links that are coupled to each other via joints (e.g., rotational joints or linear joints), and a control-component tool coupled to the links. (Typically, the control-component tool is coupled directly only to one of the links.) The computer processor determines the XYZ location and orientation of a portion (e.g., the tip) of the control-component tool, and drives the corresponding robotic unit to move a portion (e.g., the tip) of the surgical tool held by the robotic unit so as to track any changes in this location and orientation. In particular, the processor drives the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by the operator by applying a transformation to coordinates of a portion (e.g., the tip) of the control-component tool to compute corresponding coordinates of a portion (e.g., the tip) of the surgical tool. At the start of the surgical procedure, the processor executes a process for engaging the control-component tool with the surgical tool, such that movement of the control-component tool by the operator causes corresponding movement of the surgical tool by the robotic unit as described above.
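By way of illustration only, such a transformation may be sketched as a motion-scaled mapping from control-tool tip coordinates to surgical-tool tip coordinates. The scale factor and offset below are hypothetical values chosen for the sketch; the disclosure does not specify the form or parameters of the transformation.

```python
# Minimal sketch of mapping a control-component-tool tip position to a
# surgical-tool tip position. MOTION_SCALE and TOOL_OFFSET are hypothetical
# illustration values, not parameters taken from the disclosure.

MOTION_SCALE = 0.2               # e.g., 5:1 hand-to-tool motion scaling
TOOL_OFFSET = (0.0, 0.0, -0.05)  # fixed XYZ offset, in meters

def control_to_surgical(control_tip_xyz):
    """Apply the (hypothetical) transformation to control-tool tip
    coordinates to compute corresponding surgical-tool tip coordinates."""
    return tuple(MOTION_SCALE * c + o
                 for c, o in zip(control_tip_xyz, TOOL_OFFSET))

# A 10 mm hand movement along x maps to a 2 mm tool movement.
tip = control_to_surgical((0.01, 0.0, 0.0))
```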
In some embodiments, this process includes displaying an image of the portion of the body and the surgical tool, overlaying, on the image, an icon representing the control-component tool, such that a location and orientation of the icon tracks a location and orientation of the control-component tool, and in response to an alignment, by the operator, of the icon with the surgical tool in the image, engaging the control-component tool with the surgical tool.
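One way such an alignment could be detected is a simple threshold test on image-plane position and in-plane orientation; the tolerances and function names below are hypothetical, offered only as a sketch of the engagement trigger.

```python
import math

# Hypothetical tolerances for declaring the icon aligned with the tool.
POS_TOL_PX = 5.0                 # maximum tip distance, in pixels
ANG_TOL_RAD = math.radians(5.0)  # maximum in-plane orientation difference

def is_aligned(icon_xy, icon_angle, tool_xy, tool_angle):
    """Return True when the icon's image-plane position and orientation are
    within tolerance of the surgical tool's, triggering engagement."""
    dist = math.hypot(icon_xy[0] - tool_xy[0], icon_xy[1] - tool_xy[1])
    # Wrap the angular difference into [-pi, pi] before comparing.
    dang = abs((icon_angle - tool_angle + math.pi) % (2 * math.pi) - math.pi)
    return dist <= POS_TOL_PX and dang <= ANG_TOL_RAD
```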
In some cases, during the procedure, the operator may wish to temporarily disengage the control-component tool from the surgical tool, e.g., so as to move the control-component tool to a more convenient position within the workspace of the control-component tool. To address this need, in some embodiments, the processor is configured to disengage the control-component tool from the surgical tool, in response to a first input from the operator, such that the movement of the control-component tool does not cause any movement of the surgical tool, and to re-execute the process for engaging the control-component tool with the surgical tool in response to a second input from the operator. In some embodiments, the operator provides the first input by pressing an input interface such as a button or foot pedal, and provides the second input by releasing the input interface.
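The press-to-disengage, release-to-re-engage behavior resembles a clutch. A minimal state sketch follows; the class and method names are hypothetical, and the re-engagement step stands in for the full engagement process described above.

```python
class ClutchState:
    """Sketch of clutch-style engagement: pressing the input interface
    disengages the control-component tool; releasing it re-engages.
    Class and method names are hypothetical."""

    def __init__(self):
        self.engaged = False  # engaged only after the engagement process runs

    def on_press(self):
        # First input: disengage, so the operator can reposition freely.
        self.engaged = False

    def on_release(self):
        # Second input: placeholder for re-running the engagement process
        # (e.g., the icon-alignment step).
        self.engaged = True

    def forward(self, delta_xyz):
        # Control-tool motion reaches the surgical tool only while engaged.
        return delta_xyz if self.engaged else None
```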
In some embodiments, while the control-component tool is temporarily disengaged from the surgical tool, the operator may bring the control-component tool or one of the links into contact with a surface such that the surface supports the control-component tool or the link. Following the re-engagement of the control-component tool with the surgical tool, the operator continues to control the surgical tool, using the control-component tool, while the surface supports the control-component tool or the link. Advantageously, the surface helps stabilize the control-component tool. Alternatively or additionally, to help stabilize the control-component tool, the processor, in response to an input from the operator, increases the resistance of the control-component tool to forces applied to the control-component tool by the operator. In such applications, the processor typically increases the resistance of the control-component tool to forces applied to the control-component tool by the operator, without disallowing movement of the control-component tool.
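Increasing resistance without disallowing movement can be modeled, for illustration, as scaling up viscous damping in the control-component arm rather than locking it. The damping values and scaling factor below are hypothetical.

```python
# Sketch: resist operator-applied forces by raising viscous damping, rather
# than locking the joints. BASE_DAMPING and STABILIZE_FACTOR are hypothetical.
BASE_DAMPING = 0.5       # N*s/m, nominal damping
STABILIZE_FACTOR = 4.0   # multiplier applied when the operator requests it

def damping_force(velocity, stabilize):
    """Opposing force proportional to velocity; larger when stabilizing,
    but finite, so movement of the control-component tool remains possible."""
    b = BASE_DAMPING * STABILIZE_FACTOR if stabilize else BASE_DAMPING
    return -b * velocity
```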
In some cases, when one surgical tool is replaced with another surgical tool having a different preferred orientation (e.g., a different preferred pitch), the operator changes the orientation (e.g., the pitch) of the control-component tool. However, given that the hand position of the operator typically does not change significantly, this change in orientation typically causes the position of the tip of the control-component tool to change. If the transformation, via which the processor transforms the control-component-tool coordinates to the surgical-tool coordinates, were to remain the same, the coordinates of the surgical tool would undesirably jump.
To address this challenge, in some embodiments, the processor is configured to use different transformations for different tool types. In particular, the processor is configured to identify, automatically or in response to a manual input, the type of surgical tool held by the robotic unit. In response to identifying the type, the processor selects a transformation from multiple predefined transformations. Advantageously, the predefined transformations are configured to account for the different preferred orientations of different types of surgical tools, thereby minimizing the change in the coordinates of the surgical tool when one surgical tool is replaced with another.
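Selecting among predefined transformations by tool type can be sketched as a lookup table. The tool names follow the examples given elsewhere in the disclosure (chopper, forceps, injector, phacoemulsification handpiece); the pitch and vertical-offset values are hypothetical illustration values.

```python
# Hypothetical per-tool-type parameters: non-straight tools (chopper,
# forceps) are held at a different preferred pitch than straight tools
# (injector, phacoemulsification handpiece), so each type gets its own
# transformation with a different vertical translation.
TRANSFORMS = {
    "chopper":  {"pitch_deg": 30.0, "z_offset": -0.010},
    "forceps":  {"pitch_deg": 30.0, "z_offset": -0.008},
    "injector": {"pitch_deg": 0.0,  "z_offset": 0.000},
    "phaco":    {"pitch_deg": 0.0,  "z_offset": 0.000},
}

def surgical_tip_z(control_tip_z, tool_type):
    """Compensate the vertical coordinate by the per-type offset, so that
    swapping tool types does not make the surgical-tool tip jump."""
    return control_tip_z + TRANSFORMS[tool_type]["z_offset"]
```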
Other embodiments limit movement of the control-component tool with respect to a particular type of movement, such as translational movement in one or more directions, e.g., so as to keep the surgical tool away from a sensitive region of the body or to keep the surgical tool from moving laterally into the edge of an incision. In particular, the processor is configured to receive an input, from the operator, indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited. In response to the input, the processor limits movement of the control-component tool with respect to the type of movement but not with respect to the other types of movement.
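Limiting one type of movement while leaving the others free can be sketched as masking the corresponding components of the commanded motion. The function and parameter names are hypothetical.

```python
def mask_motion(delta_xyz, locked_axes):
    """Zero the translation components along locked axes (e.g., lock lateral
    x/y motion near the edge of an incision) while passing the other axes
    through unchanged. locked_axes is a set such as {"x", "y"}."""
    return tuple(0.0 if axis in locked_axes else d
                 for axis, d in zip("xyz", delta_xyz))
```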
There is therefore provided, in accordance with some applications of the present disclosure, apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the apparatus including: a control-component tool; and a processor, configured to: execute a process for engaging the control-component tool with the surgical tool such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit, by: displaying an image that includes a representation of the surgical tool, overlaying, on the image, an icon representing the control-component tool, such that a location and orientation of the icon tracks a location and orientation of the control-component tool, and in response to an alignment, by the operator, of the icon with the representation of the surgical tool in the image, engaging the control-component tool with the surgical tool, in response to a first input from the operator, disengage the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool, and in response to a second input from the operator, re-execute the process for engaging the control-component tool with the surgical tool.
In some embodiments, the processor is configured to display the representation of the surgical tool by displaying an icon representing the surgical tool.
In some embodiments, the processor is configured to display the representation of the surgical tool by displaying an image of the surgical tool.
In some embodiments, the processor is configured to display the representation of the surgical tool by displaying an image of the portion of the body and the surgical tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the apparatus further includes multiple links that are coupled to each other via one or more joints and are coupled to the control-component tool, and the processor is configured to re-execute the process for engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface, such that the surface supports the control-component tool or the one of the links when the control-component tool is re-engaged with the surgical tool.
In some embodiments, the apparatus further includes an input interface, and the operator provides the first input by pressing the input interface and provides the second input by releasing the input interface.
In some embodiments, the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
In some embodiments, the input interface includes a button on the control-component tool.
There is further provided, in accordance with some embodiments of the present disclosure, a method for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the method including: executing, by a processor, a process for engaging a control-component tool with the surgical tool such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit, by: displaying an image that includes a representation of the surgical tool, overlaying, on the image, an icon representing the control-component tool, such that a location and orientation of the icon tracks a location and orientation of the control-component tool, and in response to an alignment, by the operator, of the icon with the surgical tool in the image, engaging the control-component tool with the surgical tool; in response to a first input from the operator, disengaging the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool; and in response to a second input from the operator, re-executing the process for engaging the control-component tool with the surgical tool.
In some embodiments, displaying the image that includes the representation of the surgical tool includes displaying an icon representing the surgical tool.
In some embodiments, displaying the image that includes the representation of the surgical tool includes displaying an image of the surgical tool.
In some embodiments, displaying the image that includes the representation of the surgical tool includes displaying an image of the portion of the body and the surgical tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the control-component tool is coupled to multiple links that are coupled to each other via one or more joints, and re-executing the process for engaging the control-component tool with the surgical tool includes re-executing the process for engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface, such that the surface supports the control-component tool or the one of the links when the control-component tool is re-engaged with the surgical tool.
In some embodiments, the operator provides the first input by pressing an input interface and provides the second input by releasing the input interface.
In some embodiments, the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
In some embodiments, the input interface includes a button on the control-component tool.
There is further provided, in accordance with some embodiments of the present disclosure, a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, by: executing a process for engaging a control-component tool with the surgical tool such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit, by: displaying an image that includes a representation of the surgical tool, overlaying, on the image, an icon representing the control-component tool, such that a location and orientation of the icon tracks a location and orientation of the control-component tool, and in response to an alignment, by the operator, of the icon with the surgical tool in the image, engaging the control-component tool with the surgical tool, in response to a first input from the operator, disengaging the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool, and in response to a second input from the operator, re-executing the process for engaging the control-component tool with the surgical tool.
In some embodiments, displaying the image that includes the representation of the surgical tool includes displaying an icon representing the surgical tool.
In some embodiments, displaying the image that includes the representation of the surgical tool includes displaying an image of the surgical tool.
In some embodiments, displaying the image that includes the representation of the surgical tool includes displaying an image of the portion of the body and the surgical tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the control-component tool is coupled to multiple links that are coupled to each other via one or more joints, and re-executing the process for engaging the control-component tool with the surgical tool includes re-executing the process for engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface, such that the surface supports the control-component tool or the one of the links when the control-component tool is re-engaged with the surgical tool.
In some embodiments, the operator provides the first input by pressing an input interface and provides the second input by releasing the input interface.
In some embodiments, the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
In some embodiments, the input interface includes a button on the control-component tool.
There is further provided, in accordance with some embodiments of the present disclosure, an apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the apparatus including: multiple links coupled to each other via one or more joints; a control-component tool coupled to the links; and a processor, configured to: engage the control-component tool with the surgical tool, such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit, in response to a first input from the operator, disengage the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool, and in response to a second input from the operator, re-engage the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface such that the surface supports the control-component tool or the one of the links.
In some embodiments, the surface is horizontal.
In some embodiments, the surface supports the control-component tool or the one of the links while a hand of the operator, which holds the control-component tool, is supported by another surface.
In some embodiments, the processor is further configured to drive the robotic unit, following the re-engagement of the control-component tool with the surgical tool, to move the surgical tool correspondingly to the movement of the control-component tool while the surface supports the control-component tool or the one of the links.
In some embodiments, the movement of the control-component tool is not constrained by the surface even while the surface supports the control-component tool or the one of the links.
In some embodiments, the control-component tool or the one of the links includes a compression spring and a retractable portion coupled to the compression spring, and the surface supports the control-component tool or the one of the links at the retractable portion, such that the retractable portion retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
In some embodiments, the control-component tool is mounted to a top end of the one of the links, and the retractable portion is positioned at a bottom end of the one of the links.
In some embodiments, the apparatus further includes the surface, and the links are mounted to the surface.
In some embodiments, the retractable portion is positioned at a tip of the control-component tool.
In some embodiments, the apparatus further includes the surface and a compression spring, and the surface is mounted to the compression spring such that the surface retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
There is further provided, in accordance with some embodiments of the present disclosure, a method for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the method including: engaging, by a processor, a control-component tool, which is coupled to multiple links that are coupled to each other via one or more joints, with the surgical tool, such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit; in response to a first input from the operator, disengaging the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool; and in response to a second input from the operator, re-engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface such that the surface supports the control-component tool or the one of the links.
In some embodiments, the surface is horizontal.
In some embodiments, the surface supports the control-component tool or the one of the links while a hand of the operator, which holds the control-component tool, is supported by another surface.
In some embodiments, the method further includes, following the re-engagement of the control-component tool with the surgical tool, driving the robotic unit to move the surgical tool correspondingly to the movement of the control-component tool while the surface supports the control-component tool or the one of the links.
In some embodiments, the movement of the control-component tool is not constrained by the surface even while the surface supports the control-component tool or the one of the links.
In some embodiments, the control-component tool or the one of the links includes a compression spring and a retractable portion coupled to the compression spring, and the surface supports the control-component tool or the one of the links at the retractable portion, such that the retractable portion retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
In some embodiments, the control-component tool is mounted to a top end of the one of the links, and the retractable portion is positioned at a bottom end of the one of the links.
In some embodiments, the links are mounted to the surface.
In some embodiments, the retractable portion is positioned at a tip of the control-component tool.
In some embodiments, the surface is mounted to a compression spring such that the surface retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
There is further provided, in accordance with some embodiments of the present disclosure, a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, by: engaging a control-component tool, which is coupled to multiple links that are coupled to each other via one or more joints, with the surgical tool, such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit, in response to a first input from the operator, disengaging the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool, and in response to a second input from the operator, re-engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface such that the surface supports the control-component tool or the one of the links.
In some embodiments, the surface is horizontal.
In some embodiments, the surface supports the control-component tool or the one of the links while a hand of the operator, which holds the control-component tool, is supported by another surface.
In some embodiments, the instructions further cause the processor to drive the robotic unit, following the re-engagement of the control-component tool with the surgical tool, to move the surgical tool correspondingly to the movement of the control-component tool while the surface supports the control-component tool or the one of the links.
In some embodiments, the movement of the control-component tool is not constrained by the surface even while the surface supports the control-component tool or the one of the links.
In some embodiments, the control-component tool or the one of the links includes a compression spring and a retractable portion coupled to the compression spring, and the surface supports the control-component tool or the one of the links at the retractable portion, such that the retractable portion retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
In some embodiments, the control-component tool is mounted to a top end of the one of the links, and the retractable portion is positioned at a bottom end of the one of the links.
In some embodiments, the links are mounted to the surface.
In some embodiments, the retractable portion is positioned at a tip of the control-component tool.
In some embodiments, the surface is mounted to a compression spring such that the surface retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
There is further provided, in accordance with some embodiments of the present disclosure, an apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the apparatus including: a control-component tool; and a processor, configured to: identify a type of the surgical tool, in response to identifying the type, select a transformation from multiple predefined transformations, and drive the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by an operator, by applying the selected transformation to coordinates of the control-component tool to compute corresponding coordinates of the surgical tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the predefined transformations include respective translations that vary from each other.
In some embodiments, the translations vary from each other along a vertical axis.
In some embodiments, the processor is configured to apply the selected transformation to coordinates of a tip of the control-component tool to compute corresponding coordinates of a tip of the surgical tool.
In some embodiments, the predefined transformations map different respective coordinates of the control-component tool to the same coordinates of the surgical tool.
In some embodiments, the predefined transformations correspond to different respective orientations at which surgical tools are held during the surgical procedure.
In some embodiments, the predefined transformations correspond to different respective pitches at which surgical tools are held during the surgical procedure.
In some embodiments, the processor is configured to select a different one of the predefined transformations in response to the surgical tool being of a non-straight type, relative to the surgical tool being of a straight type.
In some embodiments, the non-straight type is selected from the group of types consisting of: an ophthalmic chopper, and forceps.
In some embodiments, the straight type is selected from the group of types consisting of: an intraocular injector, and a phacoemulsification handpiece.
In some embodiments, the processor is further configured to, prior to identifying the type of the surgical tool: drive the robotic unit to move another surgical tool correspondingly to the movement of the control-component tool, by applying another one of the predefined transformations to the coordinates of the control-component tool to compute the corresponding coordinates of the surgical tool, and disengage the control-component tool from the other surgical tool, such that the movement of the control-component tool does not cause any movement of the other surgical tool, and the processor is configured to identify the type of the surgical tool in response to the surgical tool replacing the other surgical tool while the control-component tool is disengaged from the other surgical tool.
In some embodiments, the application of the selected transformation reduces required movement of the control-component tool following the surgical tool replacing the other surgical tool, relative to if the other one of the predefined transformations were applied following the surgical tool replacing the other surgical tool.
There is further provided, in accordance with some embodiments of the present disclosure, a method for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the method including: identifying, by a processor, a type of the surgical tool; in response to identifying the type, selecting a transformation, by the processor, from multiple predefined transformations; and driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator, by applying the selected transformation to coordinates of the control-component tool to compute corresponding coordinates of the surgical tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the predefined transformations include respective translations that vary from each other.
In some embodiments, the translations vary from each other along a vertical axis.
In some embodiments, applying the selected transformation includes applying the selected transformation to coordinates of a tip of the control-component tool to compute corresponding coordinates of a tip of the surgical tool.
In some embodiments, the predefined transformations map different respective coordinates of the control-component tool to the same coordinates of the surgical tool.
In some embodiments, the predefined transformations correspond to different respective orientations at which surgical tools are held during the surgical procedure.
In some embodiments, the predefined transformations correspond to different respective pitches at which surgical tools are held during the surgical procedure.
In some embodiments, selecting the transformation includes selecting a different one of the predefined transformations in response to the surgical tool being of a non-straight type, relative to the surgical tool being of a straight type.
In some embodiments, the non-straight type is selected from the group of types consisting of: an ophthalmic chopper, and forceps.
In some embodiments, the straight type is selected from the group of types consisting of: an intraocular injector, and a phacoemulsification handpiece.
In some embodiments, the method further includes, prior to identifying the type of the surgical tool: driving the robotic unit to move another surgical tool correspondingly to the movement of the control-component tool, by applying another one of the predefined transformations to the coordinates of the control-component tool to compute the corresponding coordinates of the surgical tool; and disengaging the control-component tool from the other surgical tool, such that the movement of the control-component tool does not cause any movement of the other surgical tool, and identifying the type of the surgical tool includes identifying the type of the surgical tool in response to the surgical tool replacing the other surgical tool while the control-component tool is disengaged from the other surgical tool.
In some embodiments, the application of the selected transformation reduces required movement of the control-component tool following the surgical tool replacing the other surgical tool, relative to if the other one of the predefined transformations were applied following the surgical tool replacing the other surgical tool.
There is further provided, in accordance with some embodiments of the present disclosure, a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, by: identifying a type of the surgical tool, in response to identifying the type, selecting a transformation from multiple predefined transformations, and driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator, by applying the selected transformation to coordinates of the control-component tool to compute corresponding coordinates of the surgical tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the predefined transformations include respective translations that vary from each other.
In some embodiments, the translations vary from each other along a vertical axis.
In some embodiments, the instructions cause the processor to apply the selected transformation to coordinates of a tip of the control-component tool to compute corresponding coordinates of a tip of the surgical tool.
In some embodiments, the predefined transformations map different respective coordinates of the control-component tool to the same coordinates of the surgical tool.
In some embodiments, the predefined transformations correspond to different respective orientations at which surgical tools are held during the surgical procedure.
In some embodiments, the predefined transformations correspond to different respective pitches at which surgical tools are held during the surgical procedure.
In some embodiments, the instructions cause the processor to select a different one of the predefined transformations in response to the surgical tool being of a non-straight type, relative to the surgical tool being of a straight type.
In some embodiments, the non-straight type is selected from the group of types consisting of: an ophthalmic chopper, and forceps.
In some embodiments, the straight type is selected from the group of types consisting of: an intraocular injector, and a phacoemulsification handpiece.
In some embodiments, the instructions further cause the processor to, prior to identifying the type of the surgical tool: drive the robotic unit to move another surgical tool correspondingly to the movement of the control-component tool, by applying another one of the predefined transformations to the coordinates of the control-component tool to compute the corresponding coordinates of the surgical tool, and disengage the control-component tool from the other surgical tool, such that the movement of the control-component tool does not cause any movement of the other surgical tool, and the instructions cause the processor to identify the type of the surgical tool in response to the surgical tool replacing the other surgical tool while the control-component tool is disengaged from the other surgical tool.
In some embodiments, the application of the selected transformation reduces required movement of the control-component tool following the surgical tool replacing the other surgical tool, relative to if the other one of the predefined transformations were applied following the surgical tool replacing the other surgical tool.
There is further provided, in accordance with some embodiments of the present disclosure, an apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the apparatus including: a control-component tool; and a processor, configured to: drive the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by an operator, receive an input from the operator, and in response to the input, increase a resistance of the control-component tool to forces applied to the control-component tool by the operator, without disallowing the movement.
In some embodiments, the apparatus further includes an input interface, and the operator provides the input by pressing the input interface.
In some embodiments, the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
In some embodiments, the input interface includes a button on the control-component tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the control-component tool is held by a non-dominant hand of the operator.
In some embodiments, the apparatus further includes another control-component tool, the control-component tool is held by a first hand of the operator, and the processor is configured to increase the resistance while a second hand of the operator moves the other control-component tool.
In some embodiments, the apparatus further includes: a plurality of links coupled to each other via one or more joints and coupled to the control-component tool; and respective motors operatively coupled to the joints, and the processor is configured to increase the resistance by increasing a counterforce applied to the joints by the motors.
There is further provided, in accordance with some embodiments of the present disclosure, a method for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the method including: driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator; receiving an input from the operator; and in response to the input, increasing a resistance of the control-component tool to forces applied to the control-component tool by the operator, without disallowing the movement.
In some embodiments, the operator provides the input by pressing an input interface.
In some embodiments, the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
In some embodiments, the input interface includes a button on the control-component tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the control-component tool is held by a non-dominant hand of the operator.
In some embodiments, the control-component tool is held by a first hand of the operator, and increasing the resistance includes increasing the resistance while a second hand of the operator moves another control-component tool.
In some embodiments, the control-component tool is coupled to a plurality of links that are coupled to each other via one or more joints, respective motors are operatively coupled to the joints, and increasing the resistance includes increasing the resistance by increasing a counterforce applied to the joints by the motors.
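By way of illustration only (the following is a minimal sketch, not the actual control law of the disclosed system), increasing the resistance felt at the control-component tool without disallowing its movement can be modeled as a velocity-proportional counterforce applied at the joints by the motors, where raising a damping gain in response to the operator's input raises the resistance. The function name and gain values below are illustrative assumptions.

```python
def joint_counterforces(joint_velocities, damping_gain):
    """Compute a counterforce (torque) per joint that opposes the operator's
    motion without blocking it: a viscous term proportional to joint velocity.
    Increasing damping_gain, e.g., in response to an operator input, increases
    the resistance felt at the control-component tool."""
    return [-damping_gain * v for v in joint_velocities]

# Nominal resistance vs. increased resistance for the same joint motion:
nominal = joint_counterforces([1.0, -2.0], damping_gain=0.5)
increased = joint_counterforces([1.0, -2.0], damping_gain=2.0)
```

Because the counterforce scales with velocity rather than clamping position, the operator can still move the tool; the motion merely requires more force, consistent with the "without disallowing the movement" behavior described above.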
There is further provided, in accordance with some embodiments of the present disclosure, a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, by: driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator, receiving an input from the operator, and in response to the input, increasing a resistance of the control-component tool to forces applied to the control-component tool by the operator, without disallowing the movement.
In some embodiments, the operator provides the input by pressing an input interface.
In some embodiments, the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
In some embodiments, the input interface includes a button on the control-component tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the control-component tool is held by a non-dominant hand of the operator.
In some embodiments, the control-component tool is held by a first hand of the operator, and increasing the resistance includes increasing the resistance while a second hand of the operator moves another control-component tool.
In some embodiments, the control-component tool is coupled to a plurality of links that are coupled to each other via one or more joints, respective motors are operatively coupled to the joints, and increasing the resistance includes increasing the resistance by increasing a counterforce applied to the joints by the motors.
There is further provided, in accordance with some embodiments of the present disclosure, an apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the apparatus including: a control-component tool; and a processor, configured to: drive the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by an operator, receive an input, from the operator, indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited, and in response to the input, limit the movement of the control-component tool with respect to the type of movement but not with respect to the other types of movement.
In some embodiments, the apparatus further includes an input interface, and the operator provides the input by pressing the input interface.
In some embodiments, the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
In some embodiments, the input interface includes a button on the control-component tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the apparatus further includes: a plurality of links coupled to each other via one or more joints and coupled to the control-component tool; and respective motors operatively coupled to the joints, and the processor is configured to limit the movement by increasing a counterforce applied to the joints by the motors.
In some embodiments, the processor is configured to limit the movement by disallowing the movement with respect to the type of movement.
In some embodiments, prior to the limiting of the movement, the control-component tool has multiple degrees of freedom, and the processor is configured to limit the movement of the control-component tool with respect to at least one of the degrees of freedom.
In some embodiments, prior to the limiting of the movement, the control-component tool has six degrees of freedom.
In some embodiments, the processor is configured to limit translational movement of the control-component tool in at least one direction without limiting any rotational movement of the control-component tool.
In some embodiments, the direction is defined with respect to an orientation of the control-component tool.
In some embodiments, the direction is along a longitudinal axis of the control-component tool.
In some embodiments, the processor is configured to limit all translational movement.
In some embodiments, the processor is configured to inhibit the surgical tool from moving deeper into the portion of the body of the subject by limiting the movement.
In some embodiments, the portion of the body includes an eye of the subject.
In some embodiments, the processor is configured to inhibit the surgical tool from moving into an edge of an incision in the body of the subject by limiting the movement.
In some embodiments, the incision includes a corneal incision.
There is further provided, in accordance with some embodiments of the present disclosure, a method for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the method including: driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator; receiving an input, from the operator, indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited; and in response to the input, limiting the movement of the control-component tool with respect to the type of movement but not with respect to the other types of movement.
In some embodiments, the operator provides the input by pressing an input interface.
In some embodiments, the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
In some embodiments, the input interface includes a button on the control-component tool.
In some embodiments, the surgical tool includes an ophthalmic surgical tool.
In some embodiments, the control-component tool is coupled to a plurality of links that are coupled to each other via one or more joints, respective motors are operatively coupled to the joints, and limiting the movement includes limiting the movement by increasing a counterforce applied to the joints by the motors.
In some embodiments, limiting the movement includes disallowing the movement with respect to the type of movement.
In some embodiments, prior to the limiting of the movement, the control-component tool has multiple degrees of freedom, and limiting the movement of the control-component tool includes limiting the movement of the control-component tool with respect to at least one of the degrees of freedom.
In some embodiments, prior to the limiting of the movement, the control-component tool has six degrees of freedom.
In some embodiments, limiting the movement includes limiting translational movement of the control-component tool in at least one direction without limiting any rotational movement of the control-component tool.
In some embodiments, the direction is defined with respect to an orientation of the control-component tool.
In some embodiments, the direction is along a longitudinal axis of the control-component tool.
In some embodiments, limiting the translational movement includes limiting all translational movement.
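The case of limiting translation along the control-component tool's longitudinal axis, without limiting lateral translation or any rotation, can be sketched as a projection that removes the longitudinal component of a commanded displacement. This is a minimal illustrative sketch, not the actual implementation; the function name and the axis convention are assumptions.

```python
import numpy as np

def limit_longitudinal_translation(delta_xyz, tool_axis):
    """Remove the component of a commanded translation that lies along the
    control-component tool's longitudinal axis, leaving lateral translation
    unaffected (rotational degrees of freedom are handled separately and are
    not limited here)."""
    axis = np.asarray(tool_axis, dtype=float)
    axis = axis / np.linalg.norm(axis)  # normalize the longitudinal axis
    delta = np.asarray(delta_xyz, dtype=float)
    # Subtract the projection of the displacement onto the longitudinal axis.
    return delta - np.dot(delta, axis) * axis
```

For example, with the longitudinal axis along z, a commanded displacement of (1, 0, 2) is limited to (1, 0, 0): the tool can still move laterally but cannot advance deeper along its own axis, which is the behavior used to keep a tool from moving deeper into the eye or into the edge of an incision.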
In some embodiments, the method includes, by limiting the movement, inhibiting the surgical tool from moving deeper into the portion of the body of the subject.
In some embodiments, the portion of the body includes an eye of the subject.
In some embodiments, the method includes, by limiting the movement, inhibiting the surgical tool from moving into an edge of an incision in the body of the subject.
In some embodiments, the incision includes a corneal incision.
There is further provided, in accordance with some embodiments of the present disclosure, a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, by: driving the robotic unit to move the surgical tool correspondingly to movement of a control-component tool by an operator, receiving an input, from the operator, indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited, and in response to the input, limiting the movement of the control-component tool with respect to the type of movement but not with respect to the other types of movement.
There is further provided, in accordance with some embodiments of the present disclosure, an apparatus for performing a procedure on a portion of a body of a patient using a surgical tool that has a tip, the apparatus being for use with a robotic unit configured to move the surgical tool, the apparatus including: a control-component unit that includes: a control-component tool that is configured to be moved by an operator and that defines a tip; and an inertial measurement unit including at least one location sensor selected from the group consisting of: a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer, the inertial measurement unit being configured to generate inertial-measurement-unit data indicative of an orientation of the tip of the control-component tool; and a computer processor configured to: determine a location and orientation of the tip of the control-component tool based upon the inertial-measurement-unit data received from the location sensor, drive the robotic unit to move the tip of the surgical tool within the portion of the body of the patient in a manner that corresponds with movement of the tip of the control-component tool, and recalibrate the inertial measurement unit in response to the control-component tool being docked at a known orientation within the control-component unit.
There is further provided, in accordance with some embodiments of the present disclosure, a method for performing a procedure on a portion of a body of a patient using a surgical tool that has a tip, the method being for use with: a robotic unit configured to move the surgical tool, and a control-component unit that includes: a control-component tool that is configured to be moved by an operator and that defines a tip, and an inertial measurement unit including at least one location sensor selected from the group consisting of: a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer, the inertial measurement unit being configured to generate inertial-measurement-unit data indicative of an orientation of the tip of the control-component tool, the method including: using a computer processor, determining a location and orientation of the tip of the control-component tool based upon the inertial-measurement-unit data received from the location sensor, driving the robotic unit to move the tip of the surgical tool within the portion of the body of the patient in a manner that corresponds with movement of the tip of the control-component tool, and recalibrating the inertial measurement unit in response to the control-component tool being docked at a known orientation within the control-component unit.
There is further provided, in accordance with some embodiments of the present disclosure, a computer software product for use with: a robotic unit configured to move a surgical tool that has a tip, and a control-component unit that includes: a control-component tool that is configured to be moved by an operator and that defines a tip, and an inertial measurement unit including at least one location sensor selected from the group consisting of: a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer, the inertial measurement unit being configured to generate inertial-measurement-unit data indicative of an orientation of the tip of the control-component tool, the computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to perform a procedure on a portion of a body of a patient using the surgical tool, by: determining a location and orientation of the tip of the control-component tool based upon the inertial-measurement-unit data received from the location sensor, driving the robotic unit to move the tip of the surgical tool within the portion of the body of the patient in a manner that corresponds with movement of the tip of the control-component tool, and recalibrating the inertial measurement unit in response to the control-component tool being docked at a known orientation within the control-component unit.
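The recalibration-on-docking behavior described above can be sketched as follows. In this deliberately simplified illustration (an assumption for clarity, not the disclosed implementation), orientation is reduced to a single heading angle integrated from gyroscope readings; a real inertial measurement unit would fuse three-axis accelerometer, gyroscope, and magnetometer data into a full 3-D orientation, but the recalibration principle is the same: when the control-component tool is docked at a known orientation, accumulated drift can be zeroed out.

```python
class ImuOrientationTracker:
    """Minimal sketch of drift-prone orientation tracking with recalibration.

    Orientation is simplified here to a single heading angle (radians);
    the class name and interface are illustrative assumptions.
    """

    def __init__(self):
        self.heading = 0.0  # heading integrated from gyro angular rate

    def integrate(self, angular_rate, dt):
        # Dead reckoning: gyro bias and noise cause the estimate to drift
        # away from the true orientation over time.
        self.heading += angular_rate * dt

    def recalibrate(self, dock_heading):
        # The dock holds the control-component tool at a known orientation,
        # so the estimate can be reset to that known value, discarding drift.
        self.heading = dock_heading
```

Docking thus provides an absolute orientation reference that bounds the drift inherent in purely inertial tracking.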
There is further provided, in accordance with some embodiments of the present disclosure, an apparatus for performing a procedure on an eye of a patient using two or more tools, the apparatus including: a first control component configured to be operated by a first hand of an operator and a second control component configured to be operated by a second hand of the operator, the first control component and second control component being positioned in a given configuration with respect to each other; a first robotic unit configured to hold a first one of the two or more tools, and a second robotic unit being configured to hold a second one of the two or more tools, the first robotic unit and second robotic unit corresponding respectively to the first control component and second control component, the first robotic unit being positioned in a non-standard position with respect to the eye of the patient, so that the first robotic unit and second robotic unit are positioned with respect to each other in a configuration that is different from the given configuration; and at least one computer processor configured to process inputs into each of the first control component and second control component such as to generate outputs at the corresponding robotic unit, the computer processor being configured to convert inputs from the operator regarding movements and actions that are provided to the first control component within a first control-component tool frame of reference to corresponding movements and actions of the first robotic unit within a frame of reference of the first robotic unit.
In some embodiments, the first robotic unit is configured to insert the first tool into the patient’s eye from an inferior position with respect to the patient’s eye.
In some embodiments, the first robotic unit is configured to insert the first tool into the patient’s eye via an incision in a cornea of the patient’s eye that is configured to treat an astigmatism of the cornea.
The present disclosure will be more fully understood from the following detailed description of embodiments thereof, taken together with the drawings, in which:
BRIEF DESCRIPTION OF THE DRAWINGS
Figs. 1A and 1B are schematic illustrations of a robotic system that is configured for use in a microsurgical procedure, such as intraocular surgery, in accordance with some applications of the present disclosure;
Fig. 2A is a schematic illustration of a display showing a surgical tool placed laterally with respect to a patient’s cornea, in accordance with some applications of the present disclosure;
Fig. 2B is a schematic illustration of a display showing an augmented surgical tool overlaid on the laterally-placed surgical tool, such that the tip of the augmented surgical tool is placed in a vicinity of the tip of the surgical tool, in accordance with some applications of the present disclosure;
Figs. 2C and 2D are schematic illustrations of a display showing an augmented control-component tool being positioned such as to overlay the augmented surgical tool in order to engage a first control-component tool with the robotic system, in accordance with some applications of the present disclosure;
Fig. 2E is a schematic illustration of a display showing a surgical tool placed superiorly with respect to a patient’s cornea, in accordance with some applications of the present disclosure;
Fig. 2F is a schematic illustration of a display showing an augmented surgical tool overlaid on the superiorly-placed surgical tool, such that the tip of the augmented surgical tool is placed in a vicinity of the tip of the surgical tool, in accordance with some applications of the present disclosure;
Figs. 2G and 2H are schematic illustrations of a display showing an augmented control-component tool being positioned such as to overlay the augmented surgical tool in order to engage a second control-component tool with the robotic system, in accordance with some applications of the present disclosure;
Fig. 2I is a schematic illustration of a display showing an augmented control-component tool being moved away from an augmented surgical tool in order to disengage a control-component tool from the robotic system, in accordance with some applications of the present disclosure;
Fig. 3A is a schematic illustration of the robotic system marked with cuboids that indicate workspaces of a control component and a surgical tool for illustrative purposes, in accordance with some applications of the present disclosure;
Fig. 3B is a schematic illustration of the robotic system annotated with several frames of references for illustrative purposes, in accordance with some applications of the present disclosure;
Figs. 4A, 4B, 4C, and 4D are schematic illustrations of a control component of a control-component unit, in accordance with some applications of the present disclosure;
Figs. 5A and 5B are schematic illustrations of a workstation, in accordance with some embodiments of the present disclosure;
Fig. 6 schematically illustrates use of a tool-type-specific transformation, in accordance with some embodiments of the present disclosure; and
Fig. 7 is a flow diagram for controlling surgical tools using tool-type-specific transformations, in accordance with some embodiments of the present disclosure.
DETAILED DESCRIPTION
Reference is now made to Figs. 1A and 1B, which are schematic illustrations of a robotic system 10 that is configured for use in a microsurgical procedure, such as intraocular surgery, in accordance with some applications of the present disclosure. Typically, when used for intraocular surgery, robotic system 10 includes one or more robotic units 20, which are configured to hold tools 21, and an imaging system 22. System 10 further includes one or more displays 24 and a control-component unit 26 (e.g., a control-component unit that includes a pair of control components, as shown in the enlarged portion of Fig. 1A), which are typically located at a workstation 84. Using control-component unit 26, one or more operators 25 (e.g., healthcare professionals, such as a physician 25A and/or a nurse 25B) control robotic units 20. Typically, robotic system 10 includes one or more computer processors 28, via which components of the system and operator(s) 25 operatively interact with each other.
Figs. 1A and 1B show different setups of a robotic system 10 that is configured for ophthalmic surgery. In the configuration shown in Fig. 1A, first and second robotic units are disposed at respective lateral positions (i.e., left and right) with respect to the eye that is being operated on, such that tools 21 that are held by the robotic units are disposed at approximately 180 degrees from each other. In the configuration shown in Fig. 1B, a first robotic unit is placed laterally with respect to the eye and a second robotic unit is positioned in a superior position with respect to the eye, such that tools 21 that are held by the robotic units are disposed at approximately 90 degrees from each other. (In the context of ophthalmic procedures, the lateral position shown in Fig. 1B is referred to as the “temporal” position. As such, the terms “lateral” and “temporal” are used interchangeably in the present application.) In some cases (not shown), the first robotic unit is placed laterally with respect to the eye and the second robotic unit is positioned in an inferior position with respect to the eye, such that tools 21 that are held by the robotic units are disposed at approximately 90 degrees from each other. In general, the scope of the present disclosure includes using any number of robotic units placed at any number of respective positions in relation to the patient, and the configurations shown in Figs. 1A and 1B should not be interpreted as limiting the scope of the disclosure in any way.
Typically, movement of the robotic units (and/or control of other aspects of the robotic system) is at least partially controlled by one or more operators (e.g., healthcare professionals, such as a physician 25A and/or a nurse 25B). For example, the operator may receive images of the patient's eye and the robotic units and/or tools disposed therein, via display 24. Typically, such images are acquired by imaging system 22. For some applications, imaging system 22 is a stereoscopic imaging device and display 24 is a stereoscopic display. Based on the received images, the operator typically performs steps of the procedure.
For some applications, the operator provides commands to the robotic units via control-component unit 26. For example, Figs. 1A and 1B show physician 25A providing commands to the robotic units via control-component unit 26, while viewing images of the patient’s eye and tools 21 upon display 24. Typically, such commands include commands that control the position and/or orientation of tools that are disposed within the robotic units, and/or commands that control actions that are performed by the tools. For example, the commands may control a blade, a phacoemulsification tool (e.g., the operation mode and/or suction power of the phacoemulsification tool), forceps (e.g., opening and closing of forceps), an intraocular-lens-manipulator tool (e.g., such that the tool manipulates the intraocular lens inside the eye for precise positioning of the intraocular lens within the eye), and/or injector tools (e.g., which fluid (e.g., viscoelastic fluid, saline, etc.) should be injected, and/or at what flow rate). Alternatively or additionally, the operator may input commands that control the imaging system (e.g., the zoom, focus, orientation, and/or XYZ positioning of the imaging system).
Typically, the control-component unit includes one or more control components 30 that are configured to correspond to respective robotic units 20 of the robotic system. For example, as shown, the system may include first and second robotic units, and the control-component unit may include first and second control components, as shown. Typically, each of the control components is an arm 31 that includes a plurality of links that are coupled to each other via joints. For some applications, the control components include respective control-component tools 32 (that are typically configured to replicate the robotic units), as shown in Fig. 1A. Typically, the computer processor determines the XYZ location and orientation of the tip of the control-component tool 32, and drives the robotic unit such that the tip of the actual tool 21 that is being used to perform the procedure tracks the movements of the tip of the control-component tool and such that changes in the orientation of tool 21 track changes in the orientation of the control-component tool. For some applications, movement of the control-component tool by the operator is scaled up or down by the computer processor, as described in further detail hereinbelow.
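The scaled tip tracking described above can be sketched as follows; this is an illustrative simplification, and the function name and the scale value are assumptions (the actual scaling used by the system is described hereinbelow and may differ).

```python
def track_tip(surgical_tip_xyz, control_tip_delta_xyz, scale=0.5):
    """Compute the new surgical-tool tip position from the control-component
    tool tip's displacement. A scale below 1 scales the operator's hand
    motion down for fine intraocular movement; a scale above 1 scales it up."""
    return [s + scale * d
            for s, d in zip(surgical_tip_xyz, control_tip_delta_xyz)]

# A 2 mm hand motion along x (and 4 mm along -z) becomes a 1 mm / 2 mm
# tool-tip motion when scaled by 0.5:
new_tip = track_tip([10.0, 10.0, 10.0], [2.0, 0.0, -4.0], scale=0.5)
```

Orientation tracking works analogously, with changes in the control-component tool's orientation mapped (without positional scaling) to changes in the surgical tool's orientation.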
In some cases, tool 21 is described herein, in the specification and in the claims, as a “surgical tool.” This term is used in order to distinguish tool 21 from control-component tool 32, and should not be interpreted as limiting the type of tool that may be used as tool 21 in any way. The term “surgical tool” should be interpreted to include any one of the tools described herein and/or any other types of tools that may occur to a person of ordinary skill in the art upon reading the present disclosure. Typically, for ophthalmic procedures, the surgical tool is an ophthalmic tool, e.g., one of the ophthalmic tools described hereinabove.
Typically, the right control component controls movement of the surgical tool that is toward the right of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s right hand), and the left control component controls movement of the surgical tool that is toward the left of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s left hand).
As noted above, typically, the computer processor determines the XYZ location and orientation of the tip of the control-component tool 32, and drives the robotic unit such that the tip of the actual tool 21 that is being used to perform the procedure tracks the movements of the tip of the control-component tool and such that changes in the orientation of tool 21 track changes in the orientation of the control-component tool. Thus, if the orientation of the control-component tool changes, the computer processor typically changes the orientation of the surgical tool to correspond with the change in the orientation of the control-component tool. For this purpose, it is typically desirable that the control-component tool becomes engaged with the surgical tool of the robotic unit (such that the movements of the control-component tool control movement of the surgical tool) with the orientations of the surgical tool and the control-component tool being substantially similar to each other. If the orientations of the surgical tool and the control-component tool are dissimilar from each other, this can lead to the operator being disoriented, which may in turn lead to discomfort, extended surgical durations, and erroneous movements due to the operator’s disorientation.
The operator (e.g., physician 25A) assumes and relinquishes control of the surgical tool (via the control-component tool) multiple times during a procedure, especially when the procedure requires the use of multiple tools that are changed during surgery (as is typically the case with surgical procedures, as described above). Typically, over the course of a procedure (a) the operator assumes control of the surgical tool, (b) the operator performs surgical actions with the surgical tool, (c) the operator relinquishes control of the surgical tool, (d) the robotic system or an operator (e.g., a nurse) removes the surgical tool from the robotic unit, (e) the robotic system or the operator places a new surgical tool on the robotic unit, and steps (a)-(e) are repeated.
Typically, the control of the surgical tool by the operator has limitations. For example, the workspace in which the operator can move the control-component tool (referred to hereinafter as “the control-component workspace”) is typically physically constrained by where it is comfortable or even possible for the operator to move the control-component tool. In addition, the workspace of the surgical tool (referred to hereinafter as “the tool workspace”) is typically physically constrained by the space within which it is possible for the robotic arm to move the surgical tool. Typically, in cases in which there are a plurality of control components and a corresponding plurality of surgical tools, each of the control-component tools has a respective control-component workspace, and each of the surgical tools has a respective tool workspace. In some cases, one of the limitations on the control-component workspace for one of the control-component tools is that it impinges on the control-component workspace of a second control-component tool. Similarly, in some cases, one of the limitations on the tool workspace for one of the surgical tools is that it impinges on the tool workspace of a second one of the surgical tools.
The control-component workspace should be such that the control-component tool has sufficient freedom of movement such as to have the ability to control movement of the surgical tool within the surgical space. If the operator assumes control of the surgical tool (via the control-component tool) when the control-component tool is close to the edge of the control-component workspace, the movement of the control-component tool (and therefore that of the surgical tool) will be limited. Therefore, it is typically preferable for the operator to engage the control-component tool with the surgical tool when the control-component tool is positioned and oriented such that the operator has good freedom of movement of the control-component tool.
The tool workspace should ideally cover the space within which the tool is expected to be manipulated for the purpose of the surgery (hereinafter “the surgical space”). If the operator assumes control of the surgical tool (via the control-component tool) when the surgical tool is at the edge of the tool workspace, the movement of the surgical tool will be limited. Therefore, it is typically preferable for the operator to engage the control-component tool with the surgical tool when the surgical tool is positioned and oriented such that it has good freedom of movement.
In other words, the operator should be able to freely move the surgical tool to all positions and orientations within the surgical space, without the control-component tool reaching the limits of the control-component workspace and without the surgical tool reaching the limits of the tool workspace.
In accordance with some applications of the present disclosure, the control-component tool becomes engaged with the surgical tool (such that the movements of the control-component tool control movement of the surgical tool) with the orientations of the surgical tool and the control-component tool being substantially similar to each other, thereby avoiding disorientation of the operator. For some applications, the control-component tool becomes engaged with the surgical tool when the control-component tool and the surgical tool are toward the centers of the control-component workspace and the tool workspace, respectively. Typically, the operator is able to engage the surgical tool and the control-component tool to each other and/or disengage the surgical tool and the control-component tool from each other using standard movements of the control-component tool and without requiring additional external inputs.
Some steps of the engagement and disengagement of the surgical tool and the control-component tool in accordance with some applications of the present disclosure are described below with reference to Figs. 2A-2I.
In addition to moving the control-component tools, the operator can provide inputs to system 10 using any suitable input interface such as a foot pedal 27, a button 33, such as a button on one of the control-component tools, a keyboard, a mouse, or a touch screen belonging to display 24. Reference is now made to Fig. 2A, which is a schematic illustration of display 24 showing surgical tool 21 placed laterally with respect to a patient’s cornea 38, in accordance with some applications of the present disclosure.
The process for engaging the control-component tool with the surgical tool, such that movement of the control-component tool by the operator causes corresponding movement of the surgical tool by the robotic unit, begins with the display of an image of the eye and the surgical tool on display 24, as shown in Fig. 2A. Typically, physician 25A views the image on display 24. As indicated in Figs. 1A and 1B, typically the imaging system acquires anterior images of the patient’s eye, since the imaging system is typically disposed above the patient’s eye. As described hereinabove, for some applications, display 24 is a three-dimensional stereoscopic display. Typically, the imaging system is configured to acquire images that encompass the entire surgical space and display 24 displays the acquired images, such that any manipulation of the surgical tools occurs within the field of view that is displayed on display 24.
For some applications, the robotic system is configured to automatically place the surgical tool in position in the vicinity of the patient’s cornea (e.g., based upon images that are acquired by imaging system 22). Alternatively or additionally, the operator (e.g., nurse 25B) places the surgical tool in position in the vicinity of the patient’s cornea. Typically, the surgical tool is positioned in the vicinity of the patient’s cornea (by the robotic system and/or by the nurse) at a relatively central position within the tool workspace, and/or such that from this initial position within the tool workspace there is no movement of the control-component tool within the control-component workspace that would result in the tool being moved out of the tool workspace. Thus, starting from this position typically allows completion of a step of the procedure that is to be performed by the surgical tool, without the need to reposition the robotic unit during the step of the procedure.
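The placement criterion described above — an initial tool position from which no control-component movement can drive the tool out of the tool workspace — can be expressed as a simple geometric check if the workspaces are approximated as axis-aligned cuboids and translations are scaled by a constant factor. The function name, the cuboid model, and the dictionary representation are assumptions made for illustration only.

```python
# Hypothetical check that an initial surgical-tool position leaves no
# reachable control-component motion that would drive the tool out of
# its workspace. Workspaces are modeled as axis-aligned cuboids.

def placement_is_safe(tool_start, tool_ws, control_ws, control_start, scale):
    """tool_ws / control_ws: dicts axis -> (lo, hi) bounds;
    tool_start / control_start: dicts axis -> current value;
    scale: translation scale factor from control motion to tool motion.
    Returns True if every reachable control position maps to a point
    inside the tool workspace."""
    for axis in ("x", "y", "z"):
        c_lo, c_hi = control_ws[axis]
        t_lo, t_hi = tool_ws[axis]
        # Extreme tool positions produced by extreme control positions:
        reach_lo = tool_start[axis] + scale * (c_lo - control_start[axis])
        reach_hi = tool_start[axis] + scale * (c_hi - control_start[axis])
        if reach_lo < t_lo or reach_hi > t_hi:
            return False
    return True
```

With a tool workspace half the size of the control-component workspace and a scale of 0.5, a tool placed at the center passes this check, while a tool placed near an edge fails it, matching the preference stated above for a relatively central initial position.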
Referring now to Fig. 2B, for some applications, once surgical tool 21 is positioned in the vicinity of the patient’s cornea, the computer processor identifies the surgical tool and generates an augmented surgical tool 40 (i.e., an augmented image of a surgical tool) that overlays the image of the surgical tool itself. Advantageously, augmented surgical tool 40 facilitates identification of the surgical tool by the physician. Alternatively, an augmented surgical tool is not displayed. For some applications, the physician provides an input to the computer processor indicating whether or not she/he would like an augmented surgical tool to be displayed, and the computer processor controls the image that is displayed to the physician based on the input.
Reference is now made to Figs. 2C and 2D, which are schematic illustrations of display 24 showing an icon 42 being positioned such as to overlay augmented surgical tool 40 in order to engage a first control-component tool with the robotic system, in accordance with some applications of the present disclosure.
It is noted that although Figs. 2C and 2D show an image of the patient’s eye as well as surgical tool 21 and augmented surgical tool 40, the scope of the present disclosure includes displaying an image that includes any representation of the surgical tool, in order to facilitate the engagement of a control-component tool with the robotic system. For example, the display may display only an icon representing the surgical tool (e.g., an augmented image of a surgical tool, e.g., augmented surgical tool 40), or may display an image of the surgical tool itself, or may display an image of the surgical tool and the patient’s eye, or may display an icon representing the surgical tool (e.g., an augmented image of a surgical tool, e.g., augmented surgical tool 40) overlaid on the patient’s eye.
In the next step of the engagement process, the processor overlays, on the image, an icon 42 representing the control-component tool. Icon 42 is typically a virtual representation of the control-component tool, i.e., icon 42 is typically a computer-generated graphic that appears similar to the control-component tool. (Hence, icon 42 is also referred to herein as an “augmented control-component tool.”) Alternatively, icon 42 is an image of the control-component tool.
As described hereinabove, typically the imaging system acquires anterior images of the patient’s eye, since the imaging system is typically disposed above the patient’s eye. Typically, the view of the patient’s eye that is displayed by the display is as if the eye is being viewed from a superior position relative to the patient’s head, since this is the view that a physician is accustomed to seeing during ophthalmic surgery. Display 24 typically faces the physician’s face and the orientation and position of icon 42 on display 24 are rotated in accordance with the view of the eye that is shown by the display. Typically, the orientation of icon 42 within the frame of reference of the display is substantially similar to the orientation of the control-component tool within the control-component workspace frame of reference. However (due to the frame of reference of the display being different from that of the control-component workspace), the absolute position of the control-component tool is typically unrelated to the absolute position of the icon.
The location and orientation of icon 42 track the location and orientation of the control-component tool, i.e., movements of the control-component tool by the physician generate corresponding movements of the icon on display 24. In other words, rotation of the control-component tool through angular rotations (roll, pitch, and/or yaw) generates corresponding rotations of the icon on display 24, and translational movement (along the X, Y, or Z directions) of the control-component tool generates corresponding translational motion of the icon on display 24. For some applications, the translational motion of the icon on display 24 is scaled up or down relative to the translational motion of the control-component tool.
Next, the physician aligns the icon with the surgical tool in the image, i.e., the physician moves the control-component tool such that the icon is aligned with the image of surgical tool 21 and/or augmented surgical tool 40. Typically, the icon is considered aligned with the surgical tool when (a) the tip of the icon overlays the image of surgical tool 21 and/or augmented surgical tool 40, and (b) the orientation of the icon is substantially similar to that of the surgical tool. In response to the alignment, the processor engages the control-component tool with the surgical tool.
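The alignment criterion described above — the icon tip overlaying the surgical tool and the orientations being substantially similar — can be modeled as a pair of threshold tests in the display plane. The tolerance values, the 2D simplification, and the function name are hypothetical; the actual criteria used by the system are not specified here.

```python
# Hypothetical alignment test for engaging the control-component tool.
import math


def is_aligned(icon_tip, tool_tip, icon_yaw_deg, tool_yaw_deg,
               pos_tol=5.0, ang_tol=10.0):
    """The icon tip must lie within pos_tol (display units) of the tool
    tip, and the in-plane orientations must agree within ang_tol degrees
    (with wrap-around across +/-180)."""
    dx = icon_tip[0] - tool_tip[0]
    dy = icon_tip[1] - tool_tip[1]
    pos_ok = math.hypot(dx, dy) <= pos_tol
    # Wrap the angular difference into (-180, 180] before comparing.
    dtheta = (icon_yaw_deg - tool_yaw_deg + 180.0) % 360.0 - 180.0
    ang_ok = abs(dtheta) <= ang_tol
    return pos_ok and ang_ok
```

In a control loop, the processor would poll this test as the physician moves the icon and trigger engagement the first time it returns true.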
Typically, the computer processor is configured to position the icon such that when the physician aligns the icon with the surgical tool in the image, the control-component tool itself is disposed relatively centrally within the control-component workspace, and/or the control-component tool can be moved, from this initial position within the control-component workspace, such as to move the surgical tool to any location within the tool workspace, without the control-component tool leaving the control-component workspace.
As described above, it is typically not necessary for the icon to be perfectly aligned with the surgical tool before the engagement. Rather, in some applications, when the alignment is sufficiently close, any slight misalignment of position and/or orientation is maintained, such that the surgical tool does not move at the moment of engagement, and thereafter follows the physician's movements with slight (and typically imperceptible) misalignment. Alternatively, the computer processor drives the robotic unit to adjust the position and/or orientation of the surgical tool such as to complete the alignment.
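Maintaining a slight misalignment at the moment of engagement, as described above, amounts to latching the offset between the surgical-tool position and the mapped control-component position and carrying that offset through all subsequent commands, so that the tool does not jump when engagement occurs. The class below is a minimal positional sketch under assumed names; the actual system would handle orientation and scaling as well.

```python
class OffsetEngagement:
    """Hypothetical sketch: latch the tool-minus-control offset at the
    moment of engagement so the surgical tool does not move at that
    instant, then carry the (typically imperceptible) offset through
    all subsequent commands."""

    def __init__(self):
        self.offset = None  # None means disengaged

    def engage(self, control_pos, tool_pos):
        # Per-axis offset between where the tool actually is and where
        # the control-component tool maps to at engagement time.
        self.offset = tuple(t - c for c, t in zip(control_pos, tool_pos))

    def command(self, control_pos):
        # Commanded tool position: control position plus latched offset.
        if self.offset is None:
            raise RuntimeError("control-component tool not engaged")
        return tuple(c + o for c, o in zip(control_pos, self.offset))
```

The alternative behavior described in the text — driving the robotic unit to complete the alignment — would instead ramp this latched offset to zero over a short interval after engagement.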
Typically, once the control-component tool is engaged with the surgical tool, the augmented surgical tool and/or the icon is removed from the image that is displayed on display 24. Following the engagement, movements of the control-component tool by the physician generate corresponding movements of the surgical tool. In other words, rotation of the control-component tool through angular rotations (roll, pitch, and/or yaw) generates corresponding rotations of the surgical tool, and translational movement of the control-component tool (along the X, Y, or Z directions) generates corresponding translational motion of the surgical tool. For some applications, the translational motion of the surgical tool is scaled up or down relative to the translational motion of the control-component tool.
Reference is now made to Figs. 2E-2H, which are schematic illustrations of generally similar steps to those described with reference to Figs. 2A-D being performed with respect to a second surgical tool 21 that is placed superiorly with respect to a patient’s cornea 38, in accordance with some applications of the present disclosure.
As noted above, typically, the view of the patient’s eye that is displayed on the display is as if the eye is being viewed from a superior position relative to the patient’s head, since this is the view that a physician is accustomed to seeing during ophthalmic surgery. Display 24 typically faces the physician’s face and the orientation and position of the icon on display 24 are rotated in accordance with the view of the eye that is shown by the display. Thus, the second surgical tool 21 that is placed superiorly with respect to a patient’s cornea 38 appears at the bottom of the display in the view shown in Fig. 2E.
For some applications, the robotic system is configured to automatically place the second surgical tool in position in the vicinity of the patient’s cornea (e.g., based upon images that are acquired by imaging system 22). Alternatively or additionally, the operator (e.g., nurse 25B) places the second surgical tool in position in the vicinity of the patient’s cornea. Typically, the surgical tool is positioned in the vicinity of the patient’s cornea (by the robotic system and/or by the nurse) at a relatively central position within the tool workspace, and/or such that from this initial position within the tool workspace there is no movement of the control-component tool within the control-component workspace that would result in the tool being moved out of the tool workspace. Thus, starting from this position typically allows completion of a step of the procedure that is to be performed by the second surgical tool, without the need to reposition the second robotic unit during the step of the procedure.
For some applications, an augmented second surgical tool 44 is overlaid upon the image of the second surgical tool in order to facilitate identification of the surgical tool by the physician, as shown in Fig. 2F. Alternatively, an augmented second surgical tool is not displayed. For some applications, the physician provides an input to the computer processor indicating whether or not she/he would like an augmented second surgical tool to be displayed, and the computer processor controls the image that is displayed to the physician based on the input.
In the next step of the engagement process, the computer processor drives display 24 to display a second icon 46 overlaid on the image, as shown in Fig. 2G. Second icon 46 represents the second control-component tool, by virtue of being a virtual representation or an actual image of the second control-component tool.
Next, the physician aligns second icon 46 with surgical tool 21, i.e., the physician moves the second control-component tool such that the second icon is aligned with the image of second surgical tool 21 and/or augmented second surgical tool 44. Typically, the second icon is considered aligned with the second surgical tool when (a) the tip of the second icon overlays the image of second surgical tool 21 and/or augmented second surgical tool 44, and (b) the orientation of the second icon is substantially similar to that of the second surgical tool. In response to the alignment, the processor engages the second control-component tool with the second surgical tool.
Typically, the computer processor is configured to position the second icon such that when the physician aligns the second icon with the second surgical tool in the image, the second control-component tool itself is disposed relatively centrally within the control-component workspace, and/or the second control-component tool can be moved, from this initial position within the control-component workspace, such as to move the second surgical tool to any location within the tool workspace, without the second control-component tool leaving the control-component workspace.
As described above, it is typically not necessary for the second icon to be perfectly aligned with the second surgical tool before the engagement. Rather (as described with reference to the first surgical tool), in some applications, when the alignment is sufficiently close, any slight misalignment of position and/or orientation is maintained, such that the second surgical tool does not move at the moment of engagement, and thereafter follows the physician's movements with slight (and typically imperceptible) misalignment. Alternatively, the computer processor drives the second robotic unit to adjust the position and/or orientation of the second surgical tool such as to complete the alignment.
Typically, once the second control-component tool is engaged with the second surgical tool, the augmented second surgical tool 44 and/or the second icon is removed from the image that is displayed on display 24. Following the engagement, movements of the second control-component tool by the physician generate corresponding movements of the second surgical tool. In other words, rotation of the second control-component tool through angular rotations (roll, pitch, and/or yaw) generates corresponding rotations of the second surgical tool, and translational movement of the second control-component tool (along the X, Y, or Z directions) generates corresponding translational motion of the second surgical tool. For some applications, the translational motion of the second surgical tool is scaled up or down relative to the translational motion of the second control-component tool.
Reference is now made to Fig. 2I, which is a schematic illustration of display 24 showing icon 42 being moved away from augmented surgical tool 40 in order to disengage control-component tool 32 from surgical tool 21, in accordance with some applications of the present disclosure.
For some applications, in order to disengage control-component tool 32 from surgical tool 21, the control-component tool is moved toward an edge of the control-component workspace and/or the control-component tool is moved such that the surgical tool is moved toward the edge of the tool workspace. For some such applications, the computer processor generates an indication of the disengagement before, during, and/or after the disengagement. For example, graphical elements 48 (e.g., stars, crosses, circles, highlights, or other graphical elements) may be displayed to indicate that the disengagement has happened, is happening, or is about to happen. Alternatively, a circle or an ellipse (not shown) is displayed around the iris (or at a different location), and the computer processor is configured to interpret a portion of the tool (such as the tip) exiting the circle as an indication that the operator wishes to disengage the control-component tool from the surgical tool. For such applications, the size of the circle or ellipse is typically selected such that it is visible within the field of view, but there would typically be no reason to move the portion of the tool outside of the circle for the purpose of the surgery.
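The circle-exit gesture described above can be interpreted as a simple geometric test on the tracked tool-tip position in the display plane. The function name and the circle parameters are assumptions for illustration; an ellipse variant would test against per-axis radii instead of a single radius.

```python
# Hypothetical interpretation of the disengagement gesture: the
# operator moves the tool tip outside a circle drawn around the iris
# (or another chosen location) on the display.
import math


def wants_disengage(tip_xy, circle_center_xy, circle_radius):
    """Return True when the tool tip lies outside the displayed circle,
    which the processor treats as a request to disengage."""
    dx = tip_xy[0] - circle_center_xy[0]
    dy = tip_xy[1] - circle_center_xy[1]
    return math.hypot(dx, dy) > circle_radius
```

As the text notes, the radius would be chosen large enough that surgical motions never leave the circle unintentionally, yet small enough that the circle remains visible within the field of view.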
Although the disengagement is shown in Fig. 2I with reference to the laterally-placed first surgical tool, typically generally similar steps are performed with reference to a superiorly-placed surgical tool.
As described hereinabove, the operator (e.g., physician 25A) typically assumes and relinquishes control of the surgical tool (via the control-component tool) multiple times during a procedure, especially when the procedure requires the use of multiple tools that are changed during surgery (as is typically the case with ophthalmic procedures, as described above). Typically, over the course of a procedure (a) the operator assumes control of the surgical tool, (b) the operator performs surgical actions with the surgical tool, (c) the operator relinquishes control of the surgical tool, (d) the robotic system or an operator (e.g., a nurse) removes the surgical tool from the robotic unit, (e) the robotic system or the operator places a new surgical tool on the robotic unit, and steps (a)-(e) are repeated. Typically, each time the operator assumes control of a new tool, the steps described with reference to Figs. 2A-D or Figs. 2E-H are performed. Further typically, each time the operator relinquishes control of a tool, the steps described with reference to Fig. 2I are performed.
In general, the primary purpose of the engagement process is to ensure that before the operator begins to control the surgical tool, the control-component tool has approximately the same orientation as the surgical tool. However, the control-component tool need not be at any particular location in the control-component workspace. In fact, sometimes the operator may wish to briefly disengage the control-component tool from the surgical tool, in order to move the control-component tool to a new location within the control-component workspace, e.g., for greater comfort. To address this need, in accordance with some embodiments of the present disclosure, the system allows the operator to disengage the control-component tool from the surgical tool even without moving the control-component tool as described above, simply by providing another input. In response to the input, the processor disengages the control-component tool from the surgical tool, such that movement of the control-component tool does not cause any movement of the surgical tool. Subsequently, in response to another input from the operator, the processor re-executes the process for engaging the control-component tool with the surgical tool, which is described above with reference to Figs. 2A-H.
In some embodiments, the operator provides the first input (for disengagement) by pressing an input interface, such as foot pedal 27 or button 33 (Fig. 1A), and provides the second input (for re-engagement) by releasing the input interface. In other words, the operator continues to press the input interface while moving the control-component tool in its disengaged state. When the operator is ready to re-engage the control-component tool, the operator releases the input interface.
In some embodiments, as further described below with reference to Figs. 4A-D, the control-component tool is re-engaged with the surgical tool after the operator brings the control-component tool, or one of the links to which the control-component tool is coupled, into contact with a surface, such that the surface supports the control-component tool or the link when the control-component tool is re-engaged with the surgical tool. The surface then continues to stabilize the control-component tool while the control-component tool is held by the operator.
Typically, once the control-component tool becomes engaged with the surgical tool of the robotic unit, it does not disengage unless the physician performs a disengagement step (e.g., as described with reference to Fig. 2I or by providing another input, such as by pressing an input interface, as described above). However, it is noted that in some cases the control-component tool is disengaged from the surgical tool during a procedure, even without an input from the physician. For example, the control-component tool may be disengaged from the surgical tool based on the robotic system detecting an imminent collision of tools with each other or with a portion of the patient’s body. Typically, in the event that the control-component tool is disengaged from the surgical tool while the surgical tool is disposed within the tool workspace, the physician re-engages the control-component tool with the surgical tool by performing the steps described hereinabove.
Reference is now made to Fig. 3A, which is a schematic illustration of the robotic system annotated with cuboids that indicate a control-component workspace 60 and a tool workspace 62 for illustrative purposes, in accordance with some applications of the present disclosure. As described hereinabove, typically, the control of the surgical tool by the operator has limitations. For example, control-component workspace 60 (i.e., the workspace in which the operator can move the control-component tool) is typically physically constrained by where it is comfortable or even possible for the operator to move the control-component tool. In addition, the tool workspace 62 (i.e., the workspace of the surgical tool) is typically physically constrained by the space within which it is possible for the robotic arm to move the surgical tool.
Control-component workspace 60 should be such that the control-component tool has sufficient freedom of movement such as to have the ability to control movement of the surgical tool within the surgical space. If the operator assumes control of the surgical tool (via the control-component tool) when the control-component tool is close to the edge of the control-component workspace, the movement of the control-component tool (and therefore that of the surgical tool) will be limited. Therefore, it is typically preferable for the operator to engage the control-component tool with the surgical tool when the control-component tool is positioned and oriented such that the operator has good freedom of movement of the control-component tool.
Tool workspace 62 should ideally cover the space within which the tool is expected to be manipulated for the purpose of the surgery (hereinafter “the surgical space”). If the operator assumes control of the surgical tool (via the control-component tool) when the surgical tool is at the edge of the tool workspace, the movement of the surgical tool will be limited. Therefore, it is typically preferable for the operator to engage the control-component tool with the surgical tool when the surgical tool is positioned and oriented such that it has good freedom of movement.
In other words, the operator should be able to freely move the surgical tool to all positions and orientations within the surgical space, without the control-component tool reaching the limits of the control-component workspace and without the surgical tool reaching the limits of the tool workspace. In accordance with some applications of the present disclosure, the control-component tool becomes engaged with the surgical tool when the control-component tool and the surgical tool are toward the centers of the control-component workspace and the tool workspace, respectively.
For some applications, the control-component workspace has different dimensions from the tool workspace. For example, the tool workspace may be smaller than the control-component workspace, as shown. Typically, for such applications, movements of surgical tool 21 by robotic unit 20 are scaled down relative to movements of control-component tool 32.

Reference is now made to Fig. 3B, which is a schematic illustration of robotic system 10 annotated with several frames of reference for illustrative purposes, in accordance with some applications of the present disclosure. Typically, there are several frames of reference that are operating within the robotic system and the computer processor transforms movements between the frames of reference. An example of this is now described with reference to Fig. 3B.
In the example shown in Fig. 3B, the display displays an image that corresponds to “superior” surgery - with the image that is shown to the physician being as if the physician is facing the patient’s face, such that the patient’s chin faces up and forehead faces down. Typically, in this configuration, when the physician moves the right control-component tool in the direction of his/her right hand within the control-component frame of reference F2, the right robotic unit moves the surgical tool to the left (within right robotic unit frame of reference F5), and the surgical tool is moved to the right on display 24 (within display frame of reference F1). The display displays images captured by the imaging system, with such images being acquired within the imaging system frame of reference F3. Similarly, movement of the left control-component tool causes the left robotic unit to move the left surgical tool (within left robotic unit frame of reference F4).
Typically, the control-component unit 26 is physically attached to the same body as display 24, such that there is a rigid and constant transformation from the control-component frame of reference F2 to the display frame of reference F1. The imaging system frame of reference F3 can be moved with respect to the patient. For example, the imaging system can be rotated such that it is as if the physician is viewing the eye with the superior and lateral (i.e., temporal) directions being reversed. Typically, whatever imaging system frame of reference is used, movements of the control-component tools within frame of reference F2 generate movement of the surgical tools in the same direction within display frame of reference F1.
To achieve proper transformations of movements between the frames of reference, the computer processor typically receives inputs that are indicative of the orientations of the various frames of reference relative to each other. For some applications, the computer processor analyzes images of the robotic units within images acquired by the imaging system in order to determine the orientations of the various frames of reference relative to each other. Alternatively or additionally, the computer processor receives an input from the operator indicating the orientation of surgery that she/he would like to be displayed on the display.
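The frame-of-reference transformations described above can be sketched by composing relative rotations between frames. The following is an illustrative sketch, not the system's actual registration code; all matrices and names are placeholder assumptions.

```python
import numpy as np

# Illustrative sketch: composing relative orientations of the frames of
# reference so that a motion given in the control-component frame F2 yields
# the corresponding motion in a robotic-unit frame. Matrices are assumed.

def rot_z(theta):
    """Rotation about the vertical axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

R_display_from_control = np.eye(3)     # F2 -> F1: rigid, constant (co-mounted)
R_imaging_from_display = rot_z(np.pi)  # F1 -> F3: e.g., a 180-degree view rotation
R_robot_from_imaging = np.eye(3)       # F3 -> F5: from registration (assumed)

def control_to_robot(motion_f2):
    """Transform a motion vector from the control frame to the robot frame."""
    R = R_robot_from_imaging @ R_imaging_from_display @ R_display_from_control
    return R @ np.asarray(motion_f2, dtype=float)

motion = control_to_robot([1.0, 0.0, 0.0])  # rightward hand motion in F2
```

With the assumed 180-degree view rotation, a rightward hand motion becomes a leftward robot motion, matching the superior-surgery example in Fig. 3B.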
As described herein above, typically, the right control component controls movement of the surgical tool that is toward the right of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s right hand), and the left control component controls movement of the surgical tool that is toward the left of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s left hand). However, in some cases, the right control component controls movement of the surgical tool that is toward the left of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s left hand), and the left control component controls movement of the surgical tool that is toward the right of the patient’s head when viewing the patient from a superior position (and which would normally be controlled by the physician’s right hand). For example, in cases in which it is more intuitive to the physician and/or less physically cumbersome for the physician to use the left control-component tool to control the right surgical tool, and vice versa, rather than the other way round (e.g., based on the positions of the surgical tools and/or the handedness of the physician), the physician may switch which control component controls which tool. Alternatively or additionally, the computer processor or the physician may determine that it is easier for the surgical tools to perform their designated functions while staying within their ranges of motion by using the left control-component tool to control the right surgical tool and vice versa, and the computer processor may drive the robotic units to function accordingly (based upon the automatic detection by the computer processor or based upon an input from the physician).
Typically, in such cases, the computer processor converts inputs from the physician regarding movements and actions that are provided within the right control-component tool frame of reference to corresponding movements and actions of the left surgical tool by the left robotic unit within the left robotic unit frame of reference, and/or vice versa.
In some cases, the computer processor or the physician determines that it will be easier for one of the surgical tools to perform its designated function by being inserted into the eye from an inferior position (i.e., by being inserted from above the patient’s cheek). For some such applications, based upon the automatic detection by the computer processor or based upon an input from the physician indicating that this is the case, the computer processor drives a selected one of the robotic units to insert the surgical tool from the inferior position. The computer processor receives inputs from the physician regarding movements and actions to perform using the inferiorly-inserted surgical tool, based on the operator controlling the control-component tool of the corresponding control component while the control-component is disposed at its regular orientation. Typically, in such cases, the computer processor converts inputs from the physician regarding movements and actions that are provided within either the right or left control-component tool frame of reference to corresponding movements and actions of the inferiorly-inserted surgical tool by the selected robotic unit within the frame of reference of the selected robotic unit.
For some applications, the robotic surgery is performed by forming one incision in the patient’s cornea at a circumferential position that is not a standard position for incisions that are made during typical cataract surgery. For example, an incision may be made at a non-standard circumferential position around the patient’s cornea, in order to correct an astigmatism of the patient, at the same time as providing an insertion point (or region) for the surgical tool(s). Such incisions may include limbal relaxing incisions and/or a clear corneal incision, both of which are techniques that are known in the art and are typically customized to the corneal topography of the particular patient. For example, a clear corneal incision is typically aligned with the steep axis of the cornea. Typically, when such incisions are made and are subsequently used for the insertion of tools during a cataract (or different ophthalmic surgical) procedure, the tools are inserted and operated from non-standard positions. However, the control-component units typically remain in their regular positions with respect to the imaging system and/or other components of the system (e.g., with respect to each other).
For some such applications, based upon the automatic detection by the computer processor or based upon an input from the physician indicating that an incision has been made at a non-standard position, the computer processor drives a selected one of the robotic units to insert its surgical tool(s) from the non-standard position. The computer processor receives inputs from the physician regarding movements and actions to perform using the non-standardly-inserted surgical tool, based on the operator controlling the control-component tool of a selected one of the control components while the control-component is disposed at its regular position. Typically, in such cases, the computer processor converts inputs from the physician regarding movements and actions that are provided within the selected control-component tool frame of reference to corresponding movements and actions of the non-standardly-inserted surgical tool by the selected robotic unit within the frame of reference of the selected robotic unit.
Reference is now made to Figs. 4A, 4B, 4C, and 4D, which are schematic illustrations of control component 30, in accordance with some applications of the present disclosure. Fig. 4A and 4B show respective oblique views of the control component, Fig. 4C shows a side view, and Fig. 4D shows a top view.
Control component 30 includes control-component tool 32 and multiple links 54 to which the control-component tool is coupled. Links 54 are coupled to each other via one or more joints 47, such that the links provide multiple degrees of freedom to the control-component tool. For example, in some embodiments, links 54 provide six degrees of freedom, including three translational degrees of freedom and three rotational degrees of freedom, to the control-component tool.
In some embodiments, links 54 include a frame 50, which in some embodiments includes two curved arms. Frame 50 is configured to rotate around a first rotational axis 52X. Links 54 further include a shaft 53, to which control-component tool 32 is mounted, and one or more supporting links 55, e.g., a pair of parallel supporting links 55, which couple shaft 53 to frame 50 and rotate around a second rotational axis 52Y and around a third rotational axis 52Z. As the operator moves the control-component tool within the X-Y plane, frame 50 rotates about rotational axis 52X and supporting links 55 rotate about rotational axis 52Y. As the operator moves the control-component tool along the Z linear direction, supporting links 55 rotate about rotational axis 52Z.
Typically, control-component tool 32 is moveable by the operator to undergo pitch, yaw, and roll angular rotations. The control-component tool undergoes pitch angular rotation by rotating about a pitch rotational axis 70 at the joint 47 between the control-component tool and shaft 53, yaw angular rotation by shaft 53 rotating about its own axis 72 (which functions as the yaw rotational axis), and roll angular rotation by rotating about its own longitudinal axis 74 (which functions as the roll rotational axis).
Typically, control component 30 includes at least three rotary encoders, which are disposed at different respective joints 47. The rotary encoders detect the rotation of the links about the rotational axes, and generate signals in response thereto. Alternatively or additionally, an inertial measurement unit 76 is housed within the control-component tool. Typically, the inertial measurement unit includes a three-axis accelerometer, a three-axis gyroscope, and/or a three-axis magnetometer. The inertial measurement unit generates an inertial-measurement-unit signal describing the three-dimensional orientation of the control-component tool. Alternatively or additionally, the control component includes one or more additional rotary encoders to detect the roll, pitch and/or yaw orientation of control-component tool 32. Computer processor 28 (Fig. 1A) receives the rotary-encoder signals and the inertial-measurement-unit signal, and computes the XYZ location and orientation of the tip 58 of control-component tool 32 based on these signals.
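The computation of the tip location from joint-angle signals can be illustrated with a minimal forward-kinematics sketch. This assumes a simplified two-link planar chain, not the actual linkage of Figs. 4A-D; link lengths and names are illustrative assumptions.

```python
import numpy as np

# Minimal planar forward-kinematics sketch (assumed two-link geometry):
# rotary-encoder angles are converted into an XY tip position.

L1, L2 = 0.20, 0.15  # link lengths in meters (assumed)

def tip_position(theta1, theta2):
    """XY tip position of a two-link chain; encoder angles in radians."""
    x = L1 * np.cos(theta1) + L2 * np.cos(theta1 + theta2)
    y = L1 * np.sin(theta1) + L2 * np.sin(theta1 + theta2)
    return np.array([x, y])

tip = tip_position(0.0, 0.0)  # chain fully extended along X
```

In the actual device, the processor would additionally fuse the inertial-measurement-unit signal to resolve the roll, pitch, and yaw orientation of the tool.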
In some cases, the operator may wish to stabilize the control-component tool such that a greater amount of force is required to move the control-component tool, relative to the amount of force that is usually required. For example, while one hand of the operator moves one of the control-component tools, it may be difficult for the operator to avoid accidentally moving the other control-component tool, which is held by the other hand of the operator. Alternatively or additionally, it may be difficult for the operator to avoid accidentally moving a control-component tool held by the operator’s non-dominant hand.
To address this challenge, some embodiments allow the operator to provide an input, e.g., by pressing an input interface, such as foot pedal 27 or button 33 (Fig. 1A). In response to receiving the input, the processor increases the resistance of the control-component tool to forces applied to the control-component tool by the operator, without disallowing movement of the control-component tool. Thus, the control-component tool is stabilized against accidental movement, yet by applying enough force, the operator can still move the control-component tool if necessary.
Typically, respective motors are operatively coupled to joints 47, and the processor increases the resistance to the forces applied by the operator by increasing the (translational and/or rotational) counterforce applied to the joints by the motors. Typically, the motors are direct-drive motors (i.e., motors that do not impart motion via gear wheels). In addition, typically, the motors are linear motors, e.g., linear voice coil motors.
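The increased counterforce can be sketched as a viscous damping term commanded to each joint motor. This is a hedged sketch under assumed gains, not the actual control law of the disclosure.

```python
# Hedged sketch of the stabilization mode: on an operator input, the damping
# applied by the direct-drive joint motors is raised, so more force is needed
# to move the tool, yet movement remains possible. Gains are illustrative.

NORMAL_DAMPING = 0.05      # N*m per rad/s (assumed)
STABILIZED_DAMPING = 0.5   # assumed higher gain while stabilized

def joint_counter_torque(joint_velocity, stabilized):
    """Viscous counter-torque commanded to one joint motor."""
    damping = STABILIZED_DAMPING if stabilized else NORMAL_DAMPING
    return -damping * joint_velocity

torque = joint_counter_torque(2.0, stabilized=True)
```

Because the counterforce is velocity-proportional rather than a hard lock, deliberate motion by the operator still succeeds, matching the behavior described above.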
For example, Figs. 4A-D show control component 30 including motors 56X, 56Y, and 56Z. Motor 56X (and, optionally, an extension 56XE thereof) is coupled to an angled extension 50E of frame 50. Motor 56Y (and, optionally, an extension 56YE thereof) passes between the two curved arms of frame 50, such that motor 56Y (and, optionally, extension 56YE) is coupled to an angled extension 55E of a supporting link 55. Motor 56Z is disposed at the end of supporting links 55 that is opposite from shaft 53.
In other cases, the operator may wish to limit (e.g., disallow) at least one type of movement of the surgical tool, without limiting other types of movement of the surgical tool. For example, the operator may wish to inhibit the surgical tool from moving deeper into the eye (or, for other types of surgery, any other portion of the body) or from moving into the edge of the corneal incision (or, for other types of surgery, any other incision in the body), but still allow the surgical tool to translate in other directions and/or to rotate.
To address this challenge, some embodiments allow the operator to provide an input, e.g., by pressing an input interface, such as foot pedal 27 or button 33 (Fig. 1A). The processor is configured to receive the input, and to interpret the input as indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited. In response to the input, the processor limits (e.g., disallows) the movement of the control-component tool with respect to the type of movement but not with respect to the other types of movement.
For example, in some embodiments, the processor limits the movement of the control-component tool with respect to at least one of the multiple (e.g., six) degrees of freedom of the control-component tool. As a specific example, in some embodiments, the processor limits all translational movement (i.e., limits the movement with respect to all three translational degrees of freedom) without limiting any rotational movement of the control-component tool.
In some embodiments, the point along a surgical tool at which the tool passes through the corneal incision into the eye functions as a virtual pivot point, in that rotational movements of the corresponding control tool cause the surgical tool to pivot with respect to this point (in order to prevent tearing of the corneal incision), as described in co-assigned US Patent Application Publication 2023/0240779 to Golan, whose disclosure is incorporated herein by reference. (The aforementioned reference uses the term “remote center of motion” in place of “virtual pivot point.”) In some embodiments, the processor limits the movement of the control-component tool with respect to at least one of the multiple (e.g., six) degrees of freedom of the control-component tool in order to help the operator maintain the pivot point about which the surgical tool rotates.
In some embodiments, the processor limits translational movement of the control-component tool in at least one direction without limiting any rotational movement of the control-component tool. For example, as described immediately above, in some embodiments, the processor limits all translational movement. Alternatively, for example, the processor limits translational movement along only one or two of the X-, Y-, and Z-axes. Alternatively or additionally, the processor limits translational movement in only one direction along any of the X-, Y-, and Z-axes.
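The selective limiting of translational degrees of freedom described above can be sketched as masking of the commanded translation. This is an illustrative sketch; function and parameter names are assumptions, and rotation commands are assumed to pass through unmodified.

```python
import numpy as np

# Illustrative sketch of selective translation limiting: translation
# components along fully blocked axes, or along a single blocked direction
# of an axis, are zeroed; rotational commands would be left untouched.

def limit_translation(delta_xyz, blocked_axes=(), blocked_directions=()):
    """blocked_axes: axis indices fully limited.
    blocked_directions: (axis, sign) pairs limited in one direction only."""
    out = np.asarray(delta_xyz, dtype=float).copy()
    for axis in blocked_axes:
        out[axis] = 0.0
    for axis, sign in blocked_directions:
        if np.sign(out[axis]) == sign:
            out[axis] = 0.0
    return out

# Z fully blocked; Y blocked only in the negative direction:
a = limit_translation([1.0, 2.0, 3.0], blocked_axes=(2,))
b = limit_translation([1.0, -2.0, 3.0], blocked_directions=((1, -1),))
```

The one-direction case corresponds to, e.g., allowing the tool to withdraw from the eye while inhibiting it from moving deeper.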
Alternatively, rather than being along the X-, Y-, or Z-axis, the limited direction is defined with respect to the orientation of the control-component tool. For example, in some embodiments, the processor limits translational movement of the control-component tool along longitudinal axis 74 of the control-component tool (at least in the direction of tip 58) or perpendicular to axis 74. This can help the operator, for example, to avoid moving the surgical tool deeper into the eye or into the edge of the corneal incision.
Typically, the processor limits the movement of the control-component tool by increasing the (translational and/or rotational) counterforce applied to the rotational joints by the motors, as described above.
The processor is configured to engage control-component tool 32 with the surgical tool, using the method described above with reference to Figs. 2A-H or any other suitable method. Following the engagement, movement of the control-component tool by the operator causes corresponding movement of the surgical tool by the robotic unit. As described above in the context of Fig. 2I, in some embodiments, the processor is further configured to disengage control-component tool 32 from the surgical tool in response to a first input from the operator, such that movement of the control-component tool does not cause any movement of the surgical tool. Following the disengagement, in some embodiments, the operator brings the control-component tool or one of links 54 into contact with a surface, such as a horizontal surface, such that the surface supports the control-component tool or the link. For example, in some embodiments, the operator brings shaft 53 into contact with the surface 78 to which links 54 are mounted.
Next, in response to a second input from the operator, the processor re-engages the control-component tool with the surgical tool, using the method described above with reference to Figs. 2A-H or any other suitable method. Advantageously, the surface then stabilizes the control-component tool while the control-component tool is held by the operator. This technique for stabilizing the control-component tool can be used alternatively or additionally to increasing the resistance of the control-component tool to forces applied to the control-component tool by the operator.
Even more advantageously, some embodiments allow manipulation of the control-component tool following the re-engagement, i.e., the processor drives the robotic unit to move the surgical tool correspondingly to movement of the control-component tool while the surface supports the control-component tool or the link. Typically, in such embodiments, the movement of the control-component tool is not constrained by the surface even while the surface supports the control-component tool or the link, such that the operator retains full control over the surgical tool.
For example, in some embodiments, as shown in Fig. 4C, the element of control component 30 that is supported by the surface (i.e., the control-component tool or one of links 54) includes a compression spring 79 and a retractable portion 80 coupled to compression spring 79. The surface supports the element at retractable portion 80, such that the retractable portion retracts and spring 79 compresses, as indicated by a retraction indicator 81, when the element is pushed against the surface, as indicated by a pushing indicator 82. Thus, even movement into the surface is not constrained by the surface.
For example, for some embodiments in which control-component tool 32 is mounted to the top end of shaft 53, retractable portion 80 is positioned at the bottom end of shaft 53. Alternatively, retractable portion 80 is positioned at tip 58 of the control-component tool. In some such embodiments, tip 58 is supported by a surface 86 or 87 of workstation 84 (Fig. 5A).
In some embodiments, alternatively or additionally to control component 30 including compression spring 79, the surface is mounted to a compression spring such that the surface retracts, and the spring compresses, when the element of control component 30 is pushed against the surface.
Reference is now additionally made to Figs. 5A and 5B, which are schematic illustrations of workstation 84, in accordance with some embodiments of the present disclosure. Fig. 5B shows the workstation with surface 87 removed, such that links 54 and surface 78, upon which the links are mounted, are visible. In use, surface 87 typically covers surface 78 and links 54, while shafts 53 support respective control-component tools 32 such that the control-component tools hover above surface 86 and surface 87.
As described above with reference to Figs. 1A-B, workstation 84 typically includes control components 30 and at least one display 24. The operator sits (or stands) behind the control components, such that the control components are between the operator and the display, while facing the display. In some embodiments, workstation 84 further includes a respective docking station 88 for each control-component tool 32. When the control component is not in use, the operator docks the control-component tool in docking station 88. The lower enlarged frame in Fig. 5A shows one of the control-component tools in its docked position.
In some embodiments, while the element of the control component is supported by the surface (e.g., surface 78), the hand of the operator, which holds control-component tool 32, is supported by another surface, such that even greater stability of the control-component tool is attained. For example, in some embodiments, workstation 84 includes a surface 86 positioned behind the control components. The operator sits (or stands) behind surface 86 and reaches over surface 86 to hold the control-component tools, such that surface 86 supports the operator’s hands.
As described above, inertial measurement unit 76 typically includes a three-axis accelerometer, a three-axis gyroscope, and/or a three-axis magnetometer. The accelerometers directly measure acceleration, the gyroscopes directly measure angular velocity, and the magnetometers measure magnetic fields. There are three of each type of sensor disposed in an orthogonal configuration with respect to each other, thus allowing 3D acceleration, angular velocity, and magnetic field sensing.
In some embodiments, the combination of the above measurements is used to infer the orientation of the inertial measurement unit. Specifically, algorithms generally use the earth’s gravitational pull as a known acceleration, which is fused with the gyroscope measurements integrated over time to infer the inertial measurement unit’s orientation. Without continuous correction of the orientation by the gravitational acceleration vector, the orientation output of the inertial measurement unit tends to drift. This is because angular information is derived from the gyroscopes through numerical integration. Any error in the angular velocity measurement accumulates over time, eventually resulting in a very poor estimate of the true orientation. The gravitational acceleration creates a “ground truth” that is used to remove the drift: if the gravitational vector has not changed orientation, a change in angle derived from the gyroscopes can be treated as drift and ignored.
However, there is typically a possible rotation about the gravitational vector that cannot be sensed by the accelerometers. Since a rotation of the inertial measurement unit about the gravitational vector cannot be sensed by accelerometers, it can drift. One way to solve this drift is to fuse information from magnetometers, which provide a second “ground truth” vector that is linearly independent of the gravitational pull, i.e., magnetic North. However, this reading may be affected by other magnetic fields around the inertial measurement unit, which may cause errors in this reading.
Hence, in accordance with some applications of the present disclosure, a “ground truth” vector is derived that is linearly independent from the gravitational vector. Typically, the control-component tools are docked in a given predetermined orientation, which is typically not vertical. For some applications, a sensor 92 (such as a switch, a photo-reflector, etc.) identifies when the control-component tool is docked. When the control-component tool is docked, the computer processor recalibrates the inertial measurement unit, based on two ground truth vectors - the gravitational vector and the known orientation of the control-component tool. Thus, each time the control-component tool is docked, the inertial measurement unit is recalibrated such that any drift is corrected.
For some applications, the computer processor performs the recalibration of the inertial measurement unit using the following algorithm. When the control-component tool is detected as being docked, the inertial measurement unit transmits the roll axis position that it is detecting to the computer processor. The roll axis is the axis that typically drifts due to lack of gravitational information. The roll axis position is cast onto the horizontal plane (normal to gravity), and compared to the known, true tilt angle; that is, the angle at which the control-component tool is known to lie within the horizontal plane. Whatever difference there is between the measured orientation and the true orientation is subtracted from the measured orientation. Effectively, the inertial measurement unit’s measurement is corrected to fit the true orientation the control-component tool is known to be in, and the correction is maintained until the next time the control-component tool is docked, at which point the correction is repeated.
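The docking-recalibration algorithm above can be sketched as follows: the measured roll-axis direction is cast onto the horizontal plane, its in-plane angle is compared with the known tilt angle of the docked tool, and the difference becomes a correction maintained until the next docking. Implementation details (coordinate conventions, names) are assumptions.

```python
import numpy as np

# Sketch of the docking recalibration: gravity is assumed to lie along -Z,
# so casting onto the horizontal plane keeps only the X and Y components.

def yaw_correction(measured_roll_axis, known_tilt_angle_rad):
    """Correction (radians) that removes accumulated drift about gravity."""
    v = np.asarray(measured_roll_axis, dtype=float)
    measured_angle = np.arctan2(v[1], v[0])  # angle within the horizontal plane
    return known_tilt_angle_rad - measured_angle  # add this to remove drift

corr = yaw_correction([1.0, 0.0, 0.5], known_tilt_angle_rad=0.1)
```

Because the correction targets only the rotation about the gravitational vector, it complements (rather than duplicates) the continuous gravity-based correction of pitch and roll.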
Reference is now made to Fig. 6, which schematically illustrates use of a tool-type- specific transformation, in accordance with some embodiments of the present disclosure.
Fig. 6 shows a straight surgical tool 21a and a non-straight surgical tool 21b. In surgical tool 21a, the functional portion of the tool, which includes the tip 94a of the tool, is generally parallel to the handle of the tool. In contrast, in non-straight surgical tool 21b, the functional portion of the tool, which includes the tip 94b of the tool, is not parallel to the handle of the tool.
As described above, the processor is configured to drive the robotic unit to move the surgical tool held by the robotic unit correspondingly to movement of control-component tool 32 by an operator. Specifically, the processor continually applies a transformation, such as an affine transformation, to the coordinates of the control-component tool, to compute corresponding coordinates of the surgical tool. The processor then drives the robotic unit to move the surgical tool to the corresponding coordinates. Typically, the transformed coordinates are those of tip 58, and the corresponding coordinates are those of the tip of the surgical tool. Alternatively, the transformed coordinates are those of another portion of the control-component tool, and the corresponding coordinates are those of another, corresponding portion of the surgical tool.
In many cases, surgical tools of different types are preferably held at different orientations, e.g., at different pitches, during a surgical procedure. For example, in many cases, non-straight surgical tools are preferably held at a greater pitch than straight tools, i.e., the angle of the handle of the tool with respect to the horizontal is preferably greater for non-straight surgical tools, relative to straight surgical tools. For example, Fig. 6 shows surgical tool 21a held at a pitch of θ1, and surgical tool 21b held at a pitch of θ2 > θ1.
Furthermore, the operator typically holds control-component tool 32 at the preferred orientation of the surgical tool controlled by the control-component tool. For example, Fig. 6 shows the control-component tool held at pitch θ1 when controlling tool 21a, and at pitch θ2 when controlling tool 21b. Thus, whenever one surgical tool is replaced with another surgical tool having a different preferred orientation, the operator changes the orientation (e.g., the pitch) of the control-component tool. However, given that the hand position of the operator typically does not change significantly (e.g., given that the operator’s hand continues to rest on surface 86 (Fig. 5A)), this change in orientation typically causes the position of tip 58 (or alternatively, another portion of the control-component tool whose coordinates are transformed) to change. If the transformation were to remain the same, the corresponding coordinates of the surgical tool would undesirably change. For example, when switching from straight surgical tool 21a to non-straight surgical tool 21b, the height of tip 58 decreases, such that the corresponding height of tip 94b would also decrease. Conversely, when switching from non-straight surgical tool 21b to straight surgical tool 21a, the height of tip 58 increases, such that the corresponding height of tip 94a would also increase.
To address this challenge, in some embodiments, the processor is configured to use different transformations for different tool types. In particular, the processor is configured to identify, automatically or in response to a manual input, the type of surgical tool held by the robotic unit. In response to identifying the type, the processor selects a transformation from multiple predefined transformations, which may be affine or may be of any other suitable type. The processor then drives the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by the operator, by applying the selected transformation to the coordinates of the control-component tool to compute the corresponding coordinates of the surgical tool. The predefined transformations are configured to minimize the change in the corresponding coordinates of the surgical tool, by mapping different respective coordinates of the control-component tool to the same coordinates of the surgical tool.
Typically, the predefined transformations include respective translations that vary from each other, e.g., along the vertical axis. For example, as noted above, in some embodiments, the processor applies an affine transformation Az + b to the coordinate vector z of the control-component tool, where A is an invertible matrix and b is a translation vector. Typically, whereas A is the same for all the predefined transformations, b is unique for each predefined transformation.
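The tool-type-specific affine transformations Az + b can be sketched as follows: A is shared across tool types while the translation b differs per type, so that the different control-tool tip positions used for straight and non-straight tools map to the same surgical-tool coordinates. All numeric values here are illustrative assumptions.

```python
import numpy as np

# Sketch of tool-type-specific affine transformations Az + b. The shared
# matrix A and the per-type translations b are placeholder values.

A = 0.2 * np.eye(3)  # shared part of the transformation (assumed scale-down)
B_BY_TYPE = {
    "straight": np.array([0.0, 0.0, 0.00]),
    "non_straight": np.array([0.0, 0.0, 0.01]),  # offsets the lower tip height
}

def surgical_coords(control_tip, tool_type):
    """Apply the predefined transformation for the identified tool type."""
    return A @ np.asarray(control_tip, dtype=float) + B_BY_TYPE[tool_type]

# Different control-tool tip heights, same surgical-tool coordinates:
p_straight = surgical_coords([0.0, 0.0, 0.10], "straight")
p_bent = surgical_coords([0.0, 0.0, 0.05], "non_straight")
```

Mapping both tip positions to the same surgical coordinates is what minimizes the required hand movement when one tool type replaces another.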
For example, Fig. 6 shows the processor using a first transformation T1 for straight surgical tool 21a, but a different, second transformation T2 for non-straight surgical tool 21b. First transformation T1 has a greater vertical translation, relative to second transformation T2. By virtue of these different transformations, the different coordinates of tip 58 shown in Fig. 6 are mapped to the same corresponding coordinates for the two tools, such that tip 94b coincides with tip 94a.
Typically, the predefined transformations correspond to different respective orientations, e.g., pitches, at which surgical tools are held during the surgical procedure. For example, in Fig. 6, first transformation T1 corresponds to pitch θ1 whereas second transformation T2 corresponds to pitch θ2.
As illustrated in Fig. 6, in some embodiments, the processor selects a different one of the predefined transformations in response to the surgical tool being of a non-straight type, relative to the surgical tool being of a straight type, due to the different preferred orientations for these types. In the context of ophthalmic surgery, examples of non-straight types include an ophthalmic chopper and forceps. Additional examples of non-straight tool types include a keratome blade, a paracentesis knife, and/or a syringe (e.g., a dispersive ophthalmic viscosurgical device (OVD) syringe, a cohesive ophthalmic viscosurgical device (OVD) syringe, a staining syringe, a lidocaine syringe, a hydrodissection syringe, and/or an antibiotics syringe). Examples of straight types include an intraocular injector and a phacoemulsification handpiece.
For further details in this regard, reference is now made to Fig. 7, which shows a flow diagram 96 for an algorithm executed by processor 28 (Figs. 1A-B) to control the surgical tools using tool-type-specific transformations, in accordance with some embodiments of the present disclosure.
At a first moving step 98, the processor moves a first surgical tool using a first predefined transformation, i.e., the processor drives the robotic unit to move the first surgical tool correspondingly to the movement of the control-component tool, by applying a first transformation to the coordinates of the control-component tool to compute the corresponding coordinates of the first surgical tool. Subsequently, when the operator provides the necessary input, the processor disengages the control-component tool from the first surgical tool, e.g., as described above with reference to Fig. 2I, at a disengaging step 100, such that the movement of the control-component tool does not cause any movement of the first surgical tool.
Next, a second surgical tool replaces the first surgical tool while the control-component tool is disengaged from the first surgical tool. The processor identifies the type of the second surgical tool, at a type-identifying step 102, in response to the second surgical tool replacing the first surgical tool. (As noted above, this identification can be automatic or in response to a manual input.) Subsequently, the processor, at a transformation-selecting step 104, selects a second predefined transformation in response to the type of the second surgical tool. The processor then engages the control-component tool to the second surgical tool, e.g., as described above with reference to Figs. 2A-H, at an engaging step 106. Subsequently, at a second moving step 108, the processor moves the second surgical tool using the second predefined transformation, i.e., the processor drives the robotic unit to move the second surgical tool correspondingly to the movement of the control-component tool, by applying the second transformation to the coordinates of the control-component tool to compute the corresponding coordinates of the second surgical tool.
Advantageously, the application of the second transformation reduces the required movement of the control-component tool following the second surgical tool replacing the first surgical tool, relative to if the first transformation were applied following the second surgical tool replacing the first surgical tool. For example, suppose that the first surgical tool is straight and the second surgical tool is non-straight, or vice versa. If the same transformation were applied for both tools, the control-component tool might need to be moved a significant vertical distance following the replacement, as described above with reference to Fig. 6. On the other hand, by virtue of using different transformations, this required distance is reduced.
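The tool-swap flow of Fig. 7 (steps 98-108) can be summarized in a short sketch. The class and method names below are illustrative inventions for this example; the real system identifies the tool type automatically or from a manual input and drives a robotic unit, rather than merely returning coordinates.

```python
class ToolSwapController:
    """Minimal sketch of flow diagram 96: engage a tool with a
    type-specific transformation, disengage during a tool swap,
    then re-engage with the transformation of the new tool type."""

    def __init__(self, transforms):
        self.transforms = transforms  # tool type -> (dx, dy, dz) translation
        self.engaged = False
        self.active = None            # currently selected transformation

    def engage(self, tool_type):
        # transformation-selecting step (104) + engaging step (106)
        self.active = self.transforms[tool_type]
        self.engaged = True

    def disengage(self):
        # disengaging step (100): control-component tool movement
        # no longer causes surgical tool movement
        self.engaged = False

    def move(self, control_xyz):
        # moving steps (98, 108): apply the active transformation to the
        # control-component tool coordinates; no-op while disengaged
        if not self.engaged:
            return None
        return [c + t for c, t in zip(control_xyz, self.active)]
```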
Although some applications of the present disclosure are described with reference to cataract surgery, the scope of the present application includes applying the apparatus and methods described herein to other medical procedures, mutatis mutandis. In particular, the apparatus and methods described herein may be applied to other microsurgical procedures, such as general surgery, orthopedic surgery, gynecological surgery, otolaryngology, neurosurgery, oral and maxillofacial surgery, plastic surgery, podiatric surgery, vascular surgery, and/or pediatric surgery that is performed using microsurgical techniques. For some such applications, the imaging system includes one or more microscopic imaging units.
It is noted that the scope of the present application includes applying the apparatus and methods described herein to intraocular procedures, other than cataract surgery, mutatis mutandis. Such procedures may include collagen crosslinking, endothelial keratoplasty (e.g., DSEK, DMEK, and/or PDEK), DSO (descemet stripping without transplantation), laser assisted keratoplasty, keratoplasty, LASIK/PRK, SMILE, pterygium, ocular surface cancer treatment, secondary IOL placement (sutured, transconjunctival, etc.), iris repair, IOL reposition, IOL exchange, superficial keratectomy, Minimally Invasive Glaucoma Surgery (MIGS), limbal stem cell transplantation, astigmatic keratotomy, Limbal Relaxing Incisions (LRI), amniotic membrane transplantation (AMT), glaucoma surgery (e.g., trabs, tubes, minimally invasive glaucoma surgery), automated lamellar keratoplasty (ALK), anterior vitrectomy, and/or pars plana anterior vitrectomy.
Applications of the disclosure described herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium (e.g., a non-transitory computer-readable medium) providing program code for use by or in connection with a computer or any instruction execution system, such as computer processor 28. For the purpose of this description, a computer-usable or computer readable medium can be any apparatus that can include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Typically, the computer-usable or computer readable medium is a non-transitory computer-usable or computer readable medium.
Examples of a computer-readable medium include a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random-access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W), DVD, and a USB drive.
A data processing system suitable for storing and/or executing program code will include at least one processor (e.g., computer processor 28) coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments of the disclosure.
Network adapters may be coupled to the processor to enable the processor to become coupled to other processors or remote printers or storage devices through intervening private or public networks. Modems, cable modems, and Ethernet cards are just a few of the currently available types of network adapters.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the C programming language or similar programming languages.
It will be understood that the algorithms described herein can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer (e.g., computer processor 28) or other programmable data processing apparatus, create means for implementing the functions/acts specified in the algorithms described in the present application. These computer program instructions may also be stored in a computer-readable medium (e.g., a non-transitory computer-readable medium) that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the algorithms. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the algorithms described in the present application.
Computer processor 28 is typically a hardware device programmed with computer program instructions to produce a special purpose computer. For example, when programmed to perform the algorithms described with reference to the Figures, computer processor 28 typically acts as a special purpose robotic-system computer processor. Typically, the operations described herein that are performed by computer processor 28 transform the physical state of a memory, which is a real physical article, to have a different magnetic polarity, electrical charge, or the like depending on the technology of the memory that is used. For some applications, operations that are described as being performed by a computer processor are performed by a plurality of computer processors in combination with each other.
It will be appreciated by persons skilled in the art that the present disclosure is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present disclosure includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof that are not in the prior art, which would occur to persons skilled in the art upon reading the foregoing description.

Claims

1. An apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the apparatus comprising:
a control-component tool; and
a processor, configured to:
execute a process for engaging the control-component tool with the surgical tool such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit, by:
displaying an image that includes a representation of the surgical tool,
overlaying, on the image, an icon representing the control-component tool, such that a location and orientation of the icon tracks a location and orientation of the control-component tool, and
in response to an alignment, by the operator, of the icon with the representation of the surgical tool in the image, engaging the control-component tool with the surgical tool,
in response to a first input from the operator, disengage the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool, and
in response to a second input from the operator, re-execute the process for engaging the control-component tool with the surgical tool.
2. The apparatus according to claim 1, wherein the processor is configured to display the representation of the surgical tool by displaying an icon representing the surgical tool.
3. The apparatus according to claim 1, wherein the processor is configured to display the representation of the surgical tool by displaying an image of the surgical tool.
4. The apparatus according to claim 1, wherein the processor is configured to display the representation of the surgical tool by displaying an image of the portion of the body and the surgical tool.
5. The apparatus according to claim 1, wherein the surgical tool includes an ophthalmic surgical tool.
6. The apparatus according to claim 1, further comprising multiple links that are coupled to each other via one or more joints and are coupled to the control-component tool, wherein the processor is configured to re-execute the process for engaging the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface, such that the surface supports the control-component tool or the one of the links when the control-component tool is re-engaged with the surgical tool.
7. The apparatus according to any one of claims 1-6, further comprising an input interface, wherein the operator provides the first input by pressing the input interface and provides the second input by releasing the input interface.
8. The apparatus according to claim 7, wherein the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
9. The apparatus according to claim 7, wherein the input interface comprises a button on the control-component tool.
10. An apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the apparatus comprising:
multiple links coupled to each other via one or more joints;
a control-component tool coupled to the links; and
a processor, configured to:
engage the control-component tool with the surgical tool, such that movement of the control-component tool by an operator causes corresponding movement of the surgical tool by the robotic unit,
in response to a first input from the operator, disengage the control-component tool from the surgical tool, such that the movement of the control-component tool does not cause any movement of the surgical tool, and
in response to a second input from the operator, re-engage the control-component tool with the surgical tool after the operator brings the control-component tool or one of the links into contact with a surface such that the surface supports the control-component tool or the one of the links.
11. The apparatus according to claim 10, wherein the surface is horizontal.
12. The apparatus according to claim 10, wherein the surface supports the control-component tool or the one of the links while a hand of the operator, which holds the control-component tool, is supported by another surface.
13. The apparatus according to any one of claims 10-12, wherein the processor is further configured to drive the robotic unit, following the re-engagement of the control-component tool with the surgical tool, to move the surgical tool correspondingly to the movement of the control-component tool while the surface supports the control-component tool or the one of the links.
14. The apparatus according to claim 13, wherein the movement of the control-component tool is not constrained by the surface even while the surface supports the control-component tool or the one of the links.
15. The apparatus according to claim 14, wherein the control-component tool or the one of the links comprises a compression spring and a retractable portion coupled to the compression spring, and wherein the surface supports the control-component tool or the one of the links at the retractable portion, such that the retractable portion retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
16. The apparatus according to claim 15, wherein the control-component tool is mounted to a top end of the one of the links, and wherein the retractable portion is positioned at a bottom end of the one of the links.
17. The apparatus according to claim 16, further comprising the surface, wherein the links are mounted to the surface.
18. The apparatus according to claim 15, wherein the retractable portion is positioned at a tip of the control-component tool.
19. The apparatus according to claim 14, further comprising the surface and a compression spring, wherein the surface is mounted to the compression spring such that the surface retracts, and the spring compresses, when the control-component tool or the one of the links is pushed against the surface.
20. An apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the apparatus comprising:
a control-component tool; and
a processor, configured to:
identify a type of the surgical tool,
in response to identifying the type, select a transformation from multiple predefined transformations, and
drive the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by an operator, by applying the selected transformation to coordinates of the control-component tool to compute corresponding coordinates of the surgical tool.
21. The apparatus according to claim 20, wherein the surgical tool includes an ophthalmic surgical tool.
22. The apparatus according to claim 20, wherein the predefined transformations include respective translations that vary from each other.
23. The apparatus according to claim 22, wherein the translations vary from each other along a vertical axis.
24. The apparatus according to claim 20, wherein the processor is configured to apply the selected transformation to coordinates of a tip of the control-component tool to compute corresponding coordinates of a tip of the surgical tool.
25. The apparatus according to claim 20, wherein the predefined transformations map different respective coordinates of the control-component tool to the same coordinates of the surgical tool.
26. The apparatus according to claim 20, wherein the predefined transformations correspond to different respective orientations at which surgical tools are held during the surgical procedure.
27. The apparatus according to claim 26, wherein the predefined transformations correspond to different respective pitches at which surgical tools are held during the surgical procedure.
28. The apparatus according to any one of claims 20-27, wherein the processor is configured to select a different one of the predefined transformations in response to the surgical tool being of a non-straight type, relative to the surgical tool being of a straight type.
29. The apparatus according to claim 28, wherein the non-straight type is selected from the group of types consisting of: an ophthalmic chopper, and forceps.
30. The apparatus according to claim 28, wherein the straight type is selected from the group of types consisting of: an intraocular injector, and a phacoemulsification handpiece.
31. The apparatus according to any one of claims 20-27, wherein the processor is further configured to, prior to identifying the type of the surgical tool: drive the robotic unit to move another surgical tool correspondingly to the movement of the control-component tool, by applying another one of the predefined transformations to the coordinates of the control-component tool to compute the corresponding coordinates of the surgical tool, and disengage the control-component tool from the other surgical tool, such that the movement of the control-component tool does not cause any movement of the other surgical tool, and wherein the processor is configured to identify the type of the surgical tool in response to the surgical tool replacing the other surgical tool while the control-component tool is disengaged from the other surgical tool.
32. The apparatus according to claim 31, wherein the application of the selected transformation reduces required movement of the control-component tool following the surgical tool replacing the other surgical tool, relative to if the other one of the predefined transformations were applied following the surgical tool replacing the other surgical tool.
33. An apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the apparatus comprising:
a control-component tool; and
a processor, configured to:
drive the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by an operator,
receive an input from the operator, and
in response to the input, increase a resistance of the control-component tool to forces applied to the control-component tool by the operator, without disallowing the movement.
34. The apparatus according to claim 33, further comprising an input interface, wherein the operator provides the input by pressing the input interface.
35. The apparatus according to claim 34, wherein the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
36. The apparatus according to claim 34, wherein the input interface comprises a button on the control-component tool.
37. The apparatus according to claim 33, wherein the surgical tool includes an ophthalmic surgical tool.
38. The apparatus according to any one of claims 33-37, wherein the control-component tool is held by a non-dominant hand of the operator.
39. The apparatus according to any one of claims 33-37, further comprising another control-component tool, wherein the control-component tool is held by a first hand of the operator, and wherein the processor is configured to increase the resistance while a second hand of the operator moves the other control-component tool.
40. The apparatus according to any one of claims 33-37, further comprising: a plurality of links coupled to each other via one or more joints and coupled to the control-component tool; and respective motors operatively coupled to the joints, wherein the processor is configured to increase the resistance by increasing a counterforce applied to the joints by the motors.
41. An apparatus for performing a robotic surgical procedure on a portion of a body of a subject using a robotic unit holding a surgical tool, the apparatus comprising:
a control-component tool; and
a processor, configured to:
drive the robotic unit to move the surgical tool correspondingly to movement of the control-component tool by an operator,
receive an input, from the operator, indicating that at least one type of movement of the surgical tool should be limited, without other types of movement of the surgical tool being limited, and
in response to the input, limit the movement of the control-component tool with respect to the type of movement but not with respect to the other types of movement.
42. The apparatus according to claim 41, further comprising an input interface, wherein the operator provides the input by pressing the input interface.
43. The apparatus according to claim 42, wherein the input interface is selected from the group of input interfaces consisting of: a button, and a foot pedal.
44. The apparatus according to claim 42, wherein the input interface comprises a button on the control-component tool.
45. The apparatus according to claim 41, wherein the surgical tool includes an ophthalmic surgical tool.
46. The apparatus according to claim 41, further comprising: a plurality of links coupled to each other via one or more joints and coupled to the control-component tool; and respective motors operatively coupled to the joints, wherein the processor is configured to limit the movement by increasing a counterforce applied to the joints by the motors.
47. The apparatus according to claim 41, wherein the processor is configured to limit the movement by disallowing the movement with respect to the type of movement.
48. The apparatus according to any one of claims 41-47, wherein, prior to the limiting of the movement, the control-component tool has multiple degrees of freedom, and wherein the processor is configured to limit the movement of the control-component tool with respect to at least one of the degrees of freedom.
49. The apparatus according to claim 48, wherein, prior to the limiting of the movement, the control-component tool has six degrees of freedom.
50. The apparatus according to any one of claims 41-47, wherein the processor is configured to limit translational movement of the control-component tool in at least one direction without limiting any rotational movement of the control-component tool.
51. The apparatus according to claim 50, wherein the direction is defined with respect to an orientation of the control-component tool.
52. The apparatus according to claim 51, wherein the direction is along a longitudinal axis of the control-component tool.
53. The apparatus according to claim 50, wherein the processor is configured to limit all translational movement.
54. The apparatus according to any one of claims 41-47, wherein the processor is configured to inhibit the surgical tool from moving deeper into the portion of the body of the subject by limiting the movement.
55. The apparatus according to claim 54, wherein the portion of the body includes an eye of the subject.
56. The apparatus according to any one of claims 41-47, wherein the processor is configured to inhibit the surgical tool from moving into an edge of an incision in the body of the subject by limiting the movement.
57. The apparatus according to claim 56, wherein the incision includes a corneal incision.
PCT/IB2025/052943 2024-03-21 2025-03-20 Controlling a surgical tool for performing microsurgical procedures in a robotic manner Pending WO2025196696A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202463568216P 2024-03-21 2024-03-21
US63/568,216 2024-03-21
US202463698118P 2024-09-24 2024-09-24
US63/698,118 2024-09-24

Publications (1)

Publication Number Publication Date
WO2025196696A1 true WO2025196696A1 (en) 2025-09-25

Family

ID=95250850

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2025/052943 Pending WO2025196696A1 (en) 2024-03-21 2025-03-20 Controlling a surgical tool for performing microsurgical procedures in a robotic manner

Country Status (1)

Country Link
WO (1) WO2025196696A1 (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040034282A1 (en) * 2002-03-06 2004-02-19 Quaid Arthur E. System and method for using a haptic device as an input device
US20100234857A1 (en) * 1998-11-20 2010-09-16 Intuitive Surgical Operations, Inc. Medical robotic system with operatively couplable simulator unit for surgeon training
US20170042730A1 (en) * 2015-08-14 2017-02-16 The Johns Hopkins University Surgical system providing hands-free control of a surgical tool
US20200146885A1 (en) * 2017-05-09 2020-05-14 Sony Corporation Image processing device, image processing method, and image processing program
US20200214777A1 (en) * 2015-03-17 2020-07-09 Intuitive Surgical Operations, Inc. Systems and methods for onscreen identification of instruments in a teleoperational medical system
US20210000491A1 (en) * 2017-01-18 2021-01-07 Kb Medical, Sa Universal instrument guide for robotic surgical systems, surgical instrument systems, and methods of their use
US20210132701A1 (en) * 2018-01-18 2021-05-06 Intuitive Surgical Operations, Inc. System and method for assisting operator engagement with input devices
EP3470040B1 (en) * 2005-02-22 2022-03-16 Mako Surgical Corp. Haptic guidance system and method
US20230240779A1 (en) 2021-12-02 2023-08-03 Forsight Robotics Ltd. Force feedback for robotic microsurgical procedures
EP3658057B1 (en) * 2017-07-27 2023-08-30 Intuitive Surgical Operations, Inc. Association systems for manipulators

Similar Documents

Publication Publication Date Title
EP4087516B1 (en) Robotic system for microsurgical procedures
Charreyron et al. A magnetically navigated microcannula for subretinal injections
WO2024201236A1 (en) Engagement of microsurgical robotic system
US20100331858A1 (en) Systems, devices, and methods for robot-assisted micro-surgical stenting
US20230240779A1 (en) Force feedback for robotic microsurgical procedures
Molaei et al. Toward the art of robotic-assisted vitreoretinal surgery
US20250017676A1 (en) Robotic unit for microsurgical procedures
US20230233204A1 (en) Kinematic structures for robotic microsurgical procedures
US20240307132A1 (en) Virtual tools for microsurgical procedures
US20230240773A1 (en) One-sided robotic surgical procedure
ES3028586T3 (en) Medical device for eye surgery
WO2024231879A1 (en) Input-receiving component for robotic microsurgical procedures
JP2025534344A (en) Robotic capsulotomy
WO2024176143A1 (en) Control component for robotic microsurgical procedures
WO2025196696A1 (en) Controlling a surgical tool for performing microsurgical procedures in a robotic manner
Jeganathan et al. Robotic technology in ophthalmic surgery
US20250072986A1 (en) Orienting image for robotic surgery
CN118574585A (en) Force feedback for robotic microsurgery
WO2025229483A1 (en) Augmented images for facilitating ophthalmic surgery
CN117412723A (en) Kinematic structure and sterile drape for robotic microsurgery

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25716189

Country of ref document: EP

Kind code of ref document: A1