WO2025229483A1 - Augmented images for facilitating ophthalmic surgery - Google Patents
Info
- Publication number
- WO2025229483A1 (PCT/IB2025/054381)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eye
- processor
- image
- incision
- tool
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting in contact-lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting in contact-lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/007—Methods or devices for eye surgery
- A61F9/00736—Instruments for removal of intra-ocular material or intra-ocular injection, e.g. cataract instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/107—Visualisation of planned trajectories or target regions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2059—Mechanical position encoders
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/373—Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
- A61B2090/3735—Optical coherence tomography [OCT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/20—Surgical microscopes characterised by non-optical aspects
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
Definitions
- the present invention is related generally to computer-facilitated surgery, and specifically to computer-facilitated ophthalmic surgery.
- Cataract surgery involves the removal of the natural lens of the eye that has developed an opacification (known as a cataract), and its replacement with an intraocular lens. Such surgery typically involves a number of standard steps, which are performed sequentially.
- the patient's face around the eye is disinfected (typically, with iodine solution), and the face is covered by a sterile drape, such that only the eye is exposed.
- the eye is anesthetized, typically using a local anesthetic, which is administered in the form of liquid eye drops.
- the eyeball is then exposed, using an eyelid speculum that holds the upper and lower eyelids open.
- One or more (e.g., 2-3) incisions, typically including at least one larger incision having a three-planar form, are made in the cornea of the eye.
- the incisions are typically made using a specialized blade, which is called a keratome blade.
- another anesthetic, such as lidocaine, is injected into the anterior chamber of the eye via the corneal incisions.
- the pupil is dilated, and a viscoelastic injection is applied via the corneal incisions.
- the viscoelastic injection is performed in order to stabilize the anterior chamber and to help maintain eye pressure during the remainder of the procedure, and also in order to distend the lens capsule.
- in a subsequent stage, known as capsulorhexis, a part of the anterior lens capsule is removed, using one or more tools inserted via the corneal incisions.
- Various enhanced techniques have been developed for performing capsulorhexis, such as laser-assisted capsulorhexis, zepto-rhexis (which utilizes precision nano-pulse technology), and marker-assisted capsulorhexis (in which the cornea is marked using a predefined marker, in order to indicate the desired size for the capsule opening).
- it is common for a fluid wave to be injected via the corneal incisions, in order to dissect the cataract's outer cortical layer, in a step known as hydrodissection.
- in a subsequent step, known as hydrodelineation, the outer softer epi-nucleus of the lens is separated from the inner firmer endo-nucleus by the injection of a fluid wave.
- ultrasonic emulsification of the lens is performed, in a process known as phacoemulsification.
- the nucleus of the lens is broken initially using a chopper, following which the outer fragments of the lens are broken and removed, typically using an ultrasonic phacoemulsification probe.
- the remaining lens cortex (i.e., the outer layer of the lens) and viscoelastic material are aspirated from the capsule.
- aspirated fluids are typically replaced with irrigation of a balanced salt solution, in order to maintain fluid pressure in the anterior chamber.
- the capsule is polished. Subsequently, the intraocular lens (IOL) is inserted into the capsule.
- the IOL is typically foldable and is inserted in a folded configuration, before unfolding inside the capsule. If necessary, one or more of the incisions are sealed by elevating the pressure inside the bulbus oculi (i.e., the globe of the eye), causing the internal tissue to be pressed against the external tissue of the incisions, such as to force closed the incisions.
- Some embodiments of the present invention provide a computer processor configured to augment an image of an eye during a surgical procedure on the eye. Such embodiments are applicable, for example, to a robotic surgical procedure in which a robotic arm manipulates one or more tools under control of an operator, in that the augmentation of the image helps the operator manipulate the tools more safely and/or effectively.
- the processor is configured to calculate, for at least one point or range of points on a cornea of the eye, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points.
- the processor is further configured to demarcate the range of the tool, e.g., with a three-dimensional bounding region, in an image of the eye. In some embodiments, this augmentation is performed prior to an incision being made in the cornea, so as to help the operator place the incision.
- the processor is configured to track a cutting tool and the eye as the cutting tool is used to make an incision in the eye (e.g., in the cornea or lens capsule of the eye), and to mark the incision in an image, such as an optical coherence tomography image, of the eye.
- the processor is further configured to compute at least one measure of deviation between the incision and a planned incision, based on respective shapes and/or positions of the incision and the planned incision, and to display an output indicating the measure of deviation.
- the processor is configured to track a phacoemulsifier as the phacoemulsifier phacoemulsifies a lens of the eye. Based on the tracking, the processor indicates, in an image of the eye, locations in the eye through which the phacoemulsifier passed, e.g., by demarcating a range of at least some of the locations and/or by displaying a curve that passes through at least some of the locations. In some embodiments, the processor is further configured to identify, by processing the image, one or more detached portions of the lens that were not yet aspirated by the phacoemulsifier, and to mark the portions in the image.
- the processor facilitates the implantation of an intraocular lens, which includes an alignment marker, in an astigmatic eye, by marking, in an image of the eye, an astigmatism axis of the eye and multiple angles around the intraocular lens, thereby indicating how much the intraocular lens needs to be rotated in order for the alignment marker to be aligned with the astigmatism axis.
- the processor is configured to help prevent damage to sensitive portions of the eye by augmenting an image of the eye and/or by interfering with the control of the robotic arm.
- the processor is configured to track movement of a tool manipulated by the robotic arm, to ascertain, based on the tracking, that the tool is within a predefined threshold distance of a sensitive portion of the eye, and to inhibit the operator from moving the tool closer to the sensitive portion of the eye, in response to the ascertaining.
- the processor is configured to inhibit the operator by providing force feedback to the operator via the control-component unit used by the operator to control the robotic arm, and/or by disregarding a command from the operator to move the tool closer to the sensitive portion of the eye.
- the processor displays a warning to the operator, e.g., by displaying a halo around the cornea of the eye in an image of the eye.
- a system including a display and a processor.
- the processor is configured to calculate, during a surgical procedure on an eye, for at least one point or range of points on a cornea of the eye, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points.
- the processor is further configured to demarcate the range of the tool in an image of the eye, and to display the image on the display.
- the surgical procedure is a robotic surgical procedure in which a robotic arm manipulates the tool.
- the processor is configured to demarcate the range of the tool with a three-dimensional bounding region.
- the processor is further configured to mark the point or the range of points in the image.
- the processor is configured to demarcate the range of the tool prior to an incision being made at the point or along the range of points.
- the processor is configured to calculate and demarcate the range of the tool in response to a tip of a cutting tool for making the incision pointing toward the point.
- the processor is configured to calculate and demarcate the range of the tool in response to the tip being within a predefined distance of the cornea.
- the processor is configured to demarcate the range of the tool after an incision is made at the point or along the range of points.
- a method including, during a surgical procedure on an eye, for at least one point or range of points on a cornea of the eye, calculating, by a processor, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points.
- the method further includes demarcating the range of the tool, by the processor, in an image of the eye, and displaying the image, by the processor, on a display.
- a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored.
- the instructions when read by a processor, cause the processor to calculate, during a surgical procedure on an eye, for at least one point or range of points on a cornea of the eye, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points.
- the instructions further cause the processor to demarcate the range of the tool in an image of the eye, and to display the image on a display.
- a system including a display and a processor.
- the processor is configured to identify, during a surgical procedure on an eye in which a cutting tool is used to make an incision in the eye, a location of the incision by tracking the tool and the eye.
- the processor is further configured to mark the incision in at least one image of the eye, and to display the image on the display.
- the surgical procedure is a robotic surgical procedure in which a robotic arm manipulates the cutting tool.
- the image includes an optical coherence tomography image.
- the incision is in a lens capsule of the eye.
- the processor is further configured to: compute at least one measure of deviation between the incision and a planned incision, based on respective shapes and/or positions of the incision and the planned incision, and display an output indicating the measure of deviation.
- the incision is in a cornea of the eye.
- the processor is further configured to mark, in the image, a point along the incision at which another tool is inserted into the eye.
- the processor is further configured to: compare a shape of the incision to a canonical shape, and based on the comparing, display an output indicating a deviation of the shape from the canonical shape.
- the canonical shape is that of a three-planar incision.
- a method including, during a surgical procedure on an eye in which a cutting tool is used to make an incision in the eye, identifying a location of the incision, by a processor, by tracking the tool and the eye. The method further includes marking the incision, by the processor, in at least one image of the eye, and displaying the image, by the processor, on a display.
- a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored.
- the instructions when read by a processor, cause the processor to identify, during a surgical procedure on an eye in which a cutting tool is used to make an incision in the eye, a location of the incision by tracking the tool and the eye.
- the instructions further cause the processor to mark the incision in at least one image of the eye, and to display the image on a display.
- a system including a display and a processor.
- the processor is configured to track, during a surgical procedure on an eye, a border of a pupil of the eye as the pupil dilates.
- the processor is further configured to augment an image of the eye with an indication of a size of the pupil, based on the tracking, and to display the image on the display.
- the surgical procedure is a robotic surgical procedure.
- the processor is configured to augment the image by marking the border of the pupil.
- the processor is configured to augment the image by displaying the size in association with the image.
- a method including, during a surgical procedure on an eye, tracking a border of a pupil of the eye, by a processor, as the pupil dilates.
- the method further includes, based on the tracking, augmenting an image of the eye, by the processor, with an indication of a size of the pupil, and displaying the image, by the processor, on a display.
- a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored.
- the instructions when read by a processor, cause the processor to track, during a surgical procedure on an eye, a border of a pupil of the eye as the pupil dilates.
- the instructions further cause the processor to augment an image of the eye with an indication of a size of the pupil, based on the tracking, and to display the image on a display.
- a system including a display and a processor.
- the processor is configured to track a phacoemulsifier as the phacoemulsifier phacoemulsifies a lens of an eye.
- the processor is further configured to indicate in an image of the eye, based on the tracking, locations in the eye through which the phacoemulsifier passed, and to display the image on the display.
- the phacoemulsifier is manipulated by a robotic arm.
- the processor is configured to indicate the locations by demarcating a range of at least some of the locations in the image.
- the processor is configured to demarcate the range with a three-dimensional bounding region.
- the processor is configured to indicate the locations by displaying a curve that passes through at least some of the locations.
- the processor is further configured to: identify, by processing the image, one or more detached portions of the lens that were not yet aspirated by the phacoemulsifier, and mark the portions in the image.
- a method including tracking a phacoemulsifier, by a processor, as the phacoemulsifier phacoemulsifies a lens of an eye.
- the method further includes, based on the tracking, indicating in an image of the eye, by the processor, locations in the eye through which the phacoemulsifier passed, and displaying the image, by the processor, on a display.
- a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored.
- the instructions when read by a processor, cause the processor to track a phacoemulsifier as the phacoemulsifier phacoemulsifies a lens of an eye.
- the instructions further cause the processor to indicate in an image of the eye, based on the tracking, locations in the eye through which the phacoemulsifier passed, and to display the image on a display.
- Example 1 A system, including: a display; and a processor, configured to: while an intraocular lens, which includes an alignment marker, is implanted in an astigmatic eye, mark, in an image of the eye, an astigmatism axis of the eye and multiple angles around the intraocular lens, thereby indicating how much the intraocular lens needs to be rotated in order for the alignment marker to be aligned with the astigmatism axis, and display the image on the display.
- Example 2 The system according to Example 1, wherein the intraocular lens is inserted by a robotic arm.
- Example 3 A method, including: while an intraocular lens, which includes an alignment marker, is implanted in an astigmatic eye, marking in an image of the eye, by a processor, an astigmatism axis of the eye and multiple angles around the intraocular lens, thereby indicating how much the intraocular lens needs to be rotated in order for the alignment marker to be aligned with the astigmatism axis; and displaying the image, by the processor, on a display.
- Example 4 The method according to Example 3, wherein the intraocular lens is inserted by a robotic arm.
- Example 5 A computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to: while an intraocular lens, which includes an alignment marker, is implanted in an astigmatic eye, mark, in an image of the eye, an astigmatism axis of the eye and multiple angles around the intraocular lens, thereby indicating how much the intraocular lens needs to be rotated in order for the alignment marker to be aligned with the astigmatism axis, and display the image on a display.
- Example 6 The computer software product according to Example 5, wherein the intraocular lens is inserted by a robotic arm.
- Example 7 A system, including: a robotic arm, configured to manipulate a tool within an eye under control of an operator; and a processor, configured to: track movement of the tool during a robotic surgical procedure on the eye in which the robotic arm manipulates the tool within the eye under the control of the operator, ascertain, based on the tracking, that the tool is within a predefined threshold distance of a sensitive portion of the eye, and in response to the ascertaining, inhibit the operator from moving the tool closer to the sensitive portion of the eye.
- Example 8 The system according to Example 7, wherein the operator controls the robotic arm using a control-component unit, and wherein the processor is configured to inhibit the operator by providing force feedback to the operator via the control-component unit.
- Example 9 The system according to Example 7, wherein the processor is configured to inhibit the operator by disregarding a command from the operator to move the tool closer to the sensitive portion of the eye.
- Example 10 The system according to Example 7, wherein the operator controls the robotic arm by moving a control component, and wherein the processor is configured to inhibit the operator by preventing the control component from moving in a direction of the sensitive portion of the eye.
- Example 11 The system according to any one of Examples 7-10, wherein the processor is configured to inhibit the operator by displaying a warning to the operator.
- Example 12 The system according to Example 11, wherein the warning includes a halo around a cornea of the eye in an image of the eye.
- Example 13 A method, including: during a robotic surgical procedure on an eye in which a robotic arm manipulates a tool within the eye under control of an operator, tracking, by a processor, movement of the tool; based on the tracking, ascertaining, by the processor, that the tool is within a predefined threshold distance of a sensitive portion of the eye; and in response to the ascertaining, inhibiting the operator, by the processor, from moving the tool closer to the sensitive portion of the eye.
- Example 14 The method according to Example 13, wherein the operator controls the robotic arm using a control-component unit, and wherein inhibiting the operator includes inhibiting the operator by providing force feedback to the operator via the control-component unit.
- Example 15 The method according to Example 13, wherein inhibiting the operator includes inhibiting the operator by disregarding a command from the operator to move the tool closer to the sensitive portion of the eye.
- Example 16 The method according to Example 13, wherein the operator controls the robotic arm by moving a control component, and wherein inhibiting the operator includes inhibiting the operator by preventing the control component from moving in a direction of the sensitive portion of the eye.
- Example 17 The method according to any one of Examples 13-16, wherein inhibiting the operator includes inhibiting the operator by displaying a warning to the operator.
- Example 18 The method according to Example 17, wherein the warning includes a halo around a cornea of the eye in an image of the eye.
- Example 19 A computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to: track movement of a tool during a robotic surgical procedure on an eye in which a robotic arm manipulates the tool within the eye under control of an operator, ascertain, based on the tracking, that the tool is within a predefined threshold distance of a sensitive portion of the eye, and in response to the ascertaining, inhibit the operator from moving the tool closer to the sensitive portion of the eye.
- Example 20 The computer software product according to Example 19, wherein the operator controls the robotic arm using a control-component unit, and wherein the instructions cause the processor to inhibit the operator by providing force feedback to the operator via the control-component unit.
- Example 21 The computer software product according to Example 19, wherein the instructions cause the processor to inhibit the operator by disregarding a command from the operator to move the tool closer to the sensitive portion of the eye.
- Example 22 The computer software product according to Example 19, wherein the operator controls the robotic arm by moving a control component, and wherein the instructions cause the processor to inhibit the operator by preventing the control component from moving in a direction of the sensitive portion of the eye.
- Example 23 The computer software product according to any one of Examples 19-22, wherein the instructions cause the processor to inhibit the operator by displaying a warning to the operator.
- Example 24 The computer software product according to Example 23, wherein the warning includes a halo around a cornea of the eye in an image of the eye.
- Fig. 1 is a schematic illustration of a system for performing robotic surgery on an eye of a patient, in accordance with some embodiments of the present invention;
- Figs. 2A, 2B, and 2C are schematic illustrations of images augmented to show a range of a tool during a surgical procedure on an eye, in accordance with some embodiments of the present invention;
- Fig. 3A is a schematic illustration of an optical coherence tomography image augmented to show an incision made in an eye during a surgical procedure on the eye, in accordance with some embodiments of the present invention;
- Fig. 3B is a schematic illustration of automatic feedback with respect to an incision, in accordance with some embodiments of the present invention;
- Fig. 4 is a schematic illustration of an image augmented to show the size of a pupil of an eye, in accordance with some embodiments of the present invention;
- Fig. 5 is a schematic illustration of an image augmented to show locations in the eye through which a phacoemulsifier passed, in accordance with some embodiments of the present invention;
- Fig. 6 is a schematic illustration of an image augmented to facilitate the implantation of an intraocular lens in an astigmatic eye, in accordance with some embodiments of the present invention; and
- Fig. 7 is a schematic illustration of an image augmented to show a warning, in accordance with some embodiments of the present invention.
- FIG. 1 is a schematic illustration of a system 10 for performing robotic surgery (e.g., cataract surgery) on an eye 42 of a patient 12, in accordance with some embodiments of the present invention.
- System 10 comprises one or more (e.g., two) robotic arms 20 configured to hold respective surgical tools 21.
- System 10 further comprises a control-component unit 26 and a processor 28 (i.e., a computer processor).
- An operator 25, such as a physician or another healthcare professional, provides inputs to processor 28 via control-component unit 26.
- the processor communicates corresponding outputs to robotic arms 20, thereby moving the robotic arms (and hence surgical tools 21) in accordance with the inputs.
- control-component unit 26 comprises one or more (e.g., two) control components 70, each of which corresponds to a different respective robotic arm 20.
- Each control component 70 comprises a control tool 71, which operator 25 holds and manipulates as if control tool 71 were the surgical tool 21 held by the corresponding robotic arm.
- control tool 71 is generically-shaped, such that the size and shape of the control tool do not exactly match the size and shape of the corresponding surgical tool; for example, the control tool may comprise a stylus.
- the processor causes the corresponding robotic arm 20 to manipulate the surgical tool 21 correspondingly to the manipulation of the control tool, such that the operator controls the surgical tool via the control tool.
- operator 25 may control the pose (i.e., the position and orientation) of each surgical tool by appropriate manipulation of the corresponding control tool.
- processor 28 may move each surgical tool such that the position of a predetermined portion (e.g., the distal tip) of the surgical tool tracks the position of a predetermined portion of the corresponding control tool, and the orientation of the surgical tool matches that of the control tool.
- the operator may control the operations performed by each surgical tool by appropriate manipulation of the corresponding control tool.
- the processor is configured to scale movement of each surgical tool, relative to movement of the corresponding control tool, by a scale factor.
- this scale factor is less than one, i.e., for every movement of 1 mm by the control tool, the processor moves the surgical tool by 1/X mm, where X > 1.
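By way of illustration, the following minimal Python sketch shows one way such motion scaling might be computed. The function name, the fixed scale factor, and the millimeter units are assumptions made for this example; the patent does not specify an implementation.

```python
import numpy as np

def scaled_tool_target(tool_tip_mm: np.ndarray,
                       control_delta_mm: np.ndarray,
                       scale_factor: float = 0.2) -> np.ndarray:
    """Map a control-tool displacement to a new surgical-tool tip position.

    With scale_factor = 1/X (X > 1), every 1 mm of control-tool motion
    moves the surgical tool by 1/X mm, as described above.
    (Hypothetical helper for illustration only.)
    """
    return tool_tip_mm + scale_factor * control_delta_mm

# Example: with X = 5, a 1 mm control-tool motion along x advances the
# surgical tool tip by 0.2 mm.
tip = np.array([0.0, 0.0, 0.0])
print(scaled_tool_target(tip, np.array([1.0, 0.0, 0.0]), scale_factor=1 / 5))
```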
- each control component 70 comprises a control arm 30 coupled to control tool 71.
- Control arm 30 comprises multiple joints 32 providing multiple degrees of freedom for the movement of control tool 71, along with respective sensors for joints 32.
- the sensors are configured to communicate, to processor 28, signals indicative of any changes in the orientations of the joints (and thus, any movement of control tool 71), such that the processor may cause the corresponding robotic arm 20 to move surgical tool 21 in a corresponding manner.
- surgical tools 21 comprise a phacoemulsification tool, for which the operation mode and/or suction power can be controlled.
- surgical tools 21 comprise an injector tool such as a cannula, for which a fluid, such as a viscoelastic fluid or saline, can be selected for injection, and for which the flow rate of the selected fluid can be controlled.
- surgical tools 21 comprise an intraocular-lens-implanting tool, which is controlled so as to implant an intraocular lens in the eye, the implantation operation including inserting, moving for precise positioning, and/or rotating the intraocular lens.
- surgical tools 21 comprise a blade and/or forceps.
- System 10 further comprises an imaging system 22 comprising, for example, a wide-angle camera, a microscope camera, an optical coherence tomography imager, and/or any other suitable imaging device.
- System 10 further comprises one or more displays 24.
- at least one of the cameras belonging to imaging system 22 is stereoscopic, and at least one of displays 24 is a 3D display.
- Imaging system 22 images the surgical site and communicates the images to processor 28.
- the processor causes displays 24 to display the images.
- processor 28 is further configured to augment the displayed images with various indicia that facilitate the procedure.
- processor 28 causes at least one of the displays to display a real-time video (e.g., a real-time 3D video) of the surgical site.
- imaging system 22 acquires images of the surgical site in rapid succession (e.g., at a frame rate of at least 60 frames per second), and processor 28 displays these images (e.g., with augmentation) in real-time on the display, e.g., such that the delay between the acquisition of each image and the display of the image is less than 80 ms.
- the images in the real-time video are acquired by a microscope camera, such that the surgical site is magnified in the real-time video.
- the processor tracks eye 42 by processing the images acquired by imaging system 22. For example, in some embodiments, the processor identifies the limbus of the eye in the images. In some embodiments, the processor also processes the images so as to track the pose of each surgical tool. For example, each tool may comprise a marker, and the processor may compute the pose based on the pose of the marker in the image. In other embodiments, the processor tracks the pose of each surgical tool using other techniques, such as electromagnetic tracking.
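As a concrete illustration of image-based eye tracking, the sketch below detects the limbus as the dominant circle in a camera frame using OpenCV's Hough circle transform. This is only one plausible approach; the function name and all parameter values are illustrative assumptions rather than details taken from the patent.

```python
import cv2
import numpy as np

def find_limbus(frame_bgr: np.ndarray):
    """Detect the limbus (iris/sclera boundary) as the strongest circle.

    Returns (cx, cy, r) in pixels, or None if no circle is found.
    All Hough parameters below are assumed, not specified by the source.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 7)  # suppress specular highlights
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=200,
        param1=120, param2=40, minRadius=80, maxRadius=250)
    if circles is None:
        return None
    cx, cy, r = circles[0][0]  # strongest candidate
    return float(cx), float(cy), float(r)
```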
- control-component unit 26 comprises one or more other input interfaces (e.g., buttons).
- system 10 comprises one or more separate input interfaces (e.g., a foot pedal).
- these input interfaces can be used, by operator 25, to control the operations performed by each surgical tool.
- these input interfaces, and/or control components 70 can be used to control imaging system 22.
- the operator can adjust the x-y positioning of the imaging system, the zoom (i.e., the size of the field of view) of any of the imaging devices belonging to the imaging system, and/or the focus of any of the imaging devices belonging to the imaging system.
- processor 28 may be embodied as a single processor, or as a cooperatively networked or clustered set of processors.
- a first processor controls the movement of robotic arms 20, while a second processor performs the image-augmentation functionality described herein.
- processor 28 may be implemented solely in hardware, e.g., using one or more fixed-function or general-purpose integrated circuits, Application-Specific Integrated Circuits (ASICs), and/or Field-Programmable Gate Arrays (FPGAs). Alternatively, this functionality may be implemented at least partly in software.
- processor 28 may be embodied as a programmed processor comprising, for example, a central processing unit (CPU) and/or a Graphics Processing Unit (GPU).
- Program code including software programs, and/or data may be loaded for execution and processing by the CPU and/or GPU.
- the program code and/or data may be downloaded to the processor in electronic form, over a network, for example.
- program code and/or data may be provided and/or stored on non-transitory tangible media, such as magnetic, optical, or electronic memory.
- program code and/or data when provided to the processor, produce a machine or special-purpose computer, configured to perform the tasks described herein.
- FIGs. 2A-C are schematic illustrations of images 34 augmented to show a range of a tool during the surgical procedure on eye 42, in accordance with some embodiments of the present invention.
- processor 28 is configured to calculate, for at least one point or range of points on the cornea of eye 42, a range of a tool within the eye, provided that the tool enters the eye via the point or via any point along the range of points.
- the processor is further configured to augment at least one image 34 of the eye, such as an overhead image of the eye acquired by a microscope camera belonging to imaging system 22 (Fig. 1), by demarcating the range of the tool in the image, and to display the image on display 24.
- the augmented image is a three-dimensional image, and the range of the tool is demarcated with a three-dimensional bounding region 38. (For ease of illustration, Figs. 2A-C show bounding region 38 in two dimensions.)
- the processor also marks the point or range of points in the image.
- the range of the tool is demarcated prior to an incision being made at the point or along the range of points, to help the operator decide whether to make the incision at that location.
- the processor calculates the range of the first tool that is to enter the eye, via the incision, following the making of the incision.
- the processor calculates the range of any other tool, such as a phacoemulsifier, that is to enter the eye via the incision.
- Fig. 2A depicts a scenario in which a cutting tool 36 is to be used to incise the cornea of the eye.
- the processor calculates and demarcates the range of another tool (e.g., a phacoemulsifier) for one or more candidate incision points 40, which are optionally marked in the image.
- the processor calculates and demarcates the range of the other tool in response to the tip of the cutting tool pointing toward point 40 and, optionally, being within a predefined distance (e.g., 2 mm) of the cornea.
- the range of the tool is demarcated after an incision is made at the point or along the range of points, typically as the tool approaches the incision and/or after the tool has entered the eye via the incision.
- the processor identifies the location of the incision by tracking cutting tool 36 (i.e., tracking the pose of the cutting tool) and eye 42, as described above with reference to Fig. 1, as the cutting tool makes the incision.
- the processor marks the incision in the image.
- the demarcation of the range of the tool is done for a single point along the incision, such as the point via which the tool enters the eye, or for multiple discrete points along the incision.
- an incision 44 which has been made in the cornea, is marked in image 34.
- Incision 44 spans a range of points at which a cannula 46 can enter the eye.
- the processor demarcates the range of the cannula for two points along the incision, each point being at or near a respective end of the incision.
- the range of the tool is demarcated assuming the tool can enter the eye via any point along the incision, such that the demarcated range is a union of the respective ranges of all points along the incision.
- prior to an incision being made, the operator defines a candidate incision that spans a range of points.
- the processor marks the candidate incision in the image.
- the processor calculates and demarcates the range of the tool for one or more discrete points along the candidate incision, as in Fig. 2B, or for the entire incision, as in Fig. 2C.
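One simple way to realize such a range calculation is to model the tool as a rigid shaft that pivots at the entry point, sample its reachable tip positions, and bound them; for a range of entry points, the demarcated region is the union of the per-point ranges. The sketch below does this, returning an axis-aligned bounding box as the three-dimensional bounding region. The cone half-angle, depth limit, and sampling scheme are assumptions made for illustration.

```python
import numpy as np

def reachable_tips(entry_pt, inward_dir, max_depth_mm=10.0,
                   half_angle_rad=np.deg2rad(40), n=2000, seed=0):
    """Sample tip positions of a rigid tool pivoting at entry_pt.

    The shaft may tilt up to half_angle_rad away from inward_dir and be
    inserted up to max_depth_mm, giving a spherical sector of positions.
    """
    rng = np.random.default_rng(seed)
    w = np.asarray(inward_dir, float)
    w /= np.linalg.norm(w)
    # orthonormal basis (u, v, w) with w pointing into the eye
    a = np.array([1.0, 0, 0]) if abs(w[0]) < 0.9 else np.array([0, 1.0, 0])
    u = np.cross(w, a); u /= np.linalg.norm(u)
    v = np.cross(w, u)
    z = rng.uniform(np.cos(half_angle_rad), 1.0, n)  # cos of tilt angle
    phi = rng.uniform(0.0, 2 * np.pi, n)
    s = np.sqrt(1.0 - z**2)
    dirs = (np.outer(s * np.cos(phi), u)
            + np.outer(s * np.sin(phi), v)
            + np.outer(z, w))
    depths = rng.uniform(0.0, max_depth_mm, n)
    return np.asarray(entry_pt, float) + dirs * depths[:, None]

def bounding_region(entry_pts, inward_dirs):
    """Union of per-entry-point ranges, demarcated as a 3-D bounding box."""
    tips = np.vstack([reachable_tips(p, d)
                      for p, d in zip(entry_pts, inward_dirs)])
    return tips.min(axis=0), tips.max(axis=0)
```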
- FIG. 3A is a schematic illustration of an optical coherence tomography image 48 augmented to show an incision 44 made in eye 42 during the surgical procedure on the eye, in accordance with some embodiments of the present invention.
- the processor is configured to mark at least one corneal incision 44 in image 34, which in some embodiments includes an overhead image of the eye acquired by a microscope camera.
- the processor is configured to mark incision 44 in optical coherence tomography image 48, which typically shows a side view of a portion of the eye from a vantage point along a central axis 50 of the eye.
- the processor overlays image 48 over a corner of image 34.
- the processor is further configured to mark, in image 34 and/or optical coherence tomography image 48, the point along the incision at which a tool, such as cannula 46, is inserted into the eye.
- the processor may display a circle 52 around this point or a line next to this point.
- the point of insertion functions as a virtual pivot point, in that rotational movements of the corresponding control tool 71 (Fig. 1) cause the tool to pivot with respect to this point, as described in co-assigned US Patent Application Publication 2023/0240779 to Golan, whose disclosure is incorporated herein by reference. (The aforementioned reference uses the term “remote center of motion” in place of “virtual pivot point.”)
- the marking of the insertion point helps the operator control the tool.
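The virtual-pivot constraint itself is straightforward geometry: the tool shaft must always pass through the insertion point. A minimal sketch, with names and conventions assumed for illustration, recomputes the shaft direction and insertion depth for a commanded tip position:

```python
import numpy as np

def pivoted_shaft(pivot_mm: np.ndarray, desired_tip_mm: np.ndarray):
    """Shaft direction (unit vector) and insertion depth (mm) such that
    the shaft still passes through the virtual pivot point."""
    v = desired_tip_mm - pivot_mm
    depth = np.linalg.norm(v)
    return v / depth, depth

# Example: commanding the tip 4 mm below and 3 mm lateral to the pivot.
direction, depth = pivoted_shaft(np.zeros(3), np.array([3.0, 0.0, -4.0]))
print(direction, depth)  # [ 0.6  0.  -0.8] 5.0
```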
- Fig. 3B is a schematic illustration of automatic feedback with respect to incision 44, in accordance with some embodiments of the present invention.
- in addition to marking incision 44, the processor is configured to compare the shape of the incision to a canonical shape 54, such as that of a three-planar incision, e.g., by performing a cross-correlation of an image of the incision with the canonical shape. Based on the comparison, the processor displays an output indicating the deviation of the shape of incision 44 from canonical shape 54, thus helping the operator improve his or her incision-making skills.
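A minimal sketch of such a shape comparison, using OpenCV's normalized cross-correlation: a score near 1.0 indicates that the incision closely matches the canonical template. Treating both inputs as single-channel images at the same scale is an assumption of this example.

```python
import cv2
import numpy as np

def incision_shape_score(incision_img: np.ndarray,
                         canonical_img: np.ndarray) -> float:
    """Peak normalized cross-correlation between an incision image and a
    canonical-shape template (template must not exceed the image size).
    Returns a score in [-1, 1]; 1.0 would be a perfect match."""
    result = cv2.matchTemplate(incision_img.astype(np.float32),
                               canonical_img.astype(np.float32),
                               cv2.TM_CCOEFF_NORMED)
    return float(result.max())
```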
- the processor marks an incision, such as a capsulorhexis incision, in the lens capsule of the eye, e.g., as shown in Fig. 6.
- the processor identifies the location of the incision by tracking the cutting tool (which may comprise, for example, forceps, a laser-assisted capsulorhexis tool, or a zepto-rhexis tool) and eye 42, as described above with reference to Fig. 1, as the cutting tool makes the incision.
- the processor identifies the incision by processing images of the eye, e.g., using an edge-detection algorithm.
- the processor computes at least one measure of deviation between the incision and the planned incision, based on the respective shapes and/or positions of the incision and the planned incision. The processor then displays an output indicating the measure of deviation.
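As one plausible pair of deviation measures, the sketch below computes a symmetric Hausdorff distance between point sets sampled along the two incisions (shape deviation) together with the offset between their centroids (position deviation). The choice of metrics is an assumption for illustration, not the patent's specified method.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def incision_deviation(actual_pts: np.ndarray, planned_pts: np.ndarray):
    """Deviation between the actual and planned incisions.

    actual_pts, planned_pts: (N, 2) or (N, 3) points sampled along each
    incision. Returns (shape_deviation, position_deviation) in the units
    of the input coordinates.
    """
    d_shape = max(directed_hausdorff(actual_pts, planned_pts)[0],
                  directed_hausdorff(planned_pts, actual_pts)[0])
    d_pos = float(np.linalg.norm(actual_pts.mean(axis=0)
                                 - planned_pts.mean(axis=0)))
    return d_shape, d_pos
```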
- Fig. 4 is a schematic illustration of image 34 augmented to show the size of a pupil 56 of eye 42, in accordance with some embodiments of the present invention.
- the processor tracks the border 58 of the pupil, e.g., using an edge-detection algorithm. Based on the tracking, the processor augments image 34 with an indication of the size of the pupil. For example, in some embodiments, the processor marks border 58. Alternatively or additionally, the processor displays the size (e.g., the pupil diameter) in association with the image, e.g., over a portion of the image near the pupil.
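Given border pixels from such an edge detector, the pupil size can be estimated with a least-squares circle fit. The sketch below uses the classic Kasa fit; the conversion from pixels to millimeters is omitted, as it depends on an assumed camera calibration.

```python
import numpy as np

def fit_pupil_circle(border_xy: np.ndarray):
    """Least-squares (Kasa) circle fit to pupil-border pixel coordinates.

    border_xy: (N, 2) array of (x, y) edge points.
    Returns ((cx, cy), diameter) in pixels.
    """
    x, y = border_xy[:, 0], border_xy[:, 1]
    # Solve x^2 + y^2 = c0*x + c1*y + c2 in the least-squares sense,
    # where c0 = 2*cx, c1 = 2*cy, c2 = r^2 - cx^2 - cy^2.
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x**2 + y**2
    c0, c1, c2 = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = c0 / 2.0, c1 / 2.0
    radius = np.sqrt(c2 + cx**2 + cy**2)
    return (cx, cy), 2.0 * radius
```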
- Fig. 5 is a schematic illustration of image 34 augmented to show locations in the eye through which a phacoemulsifier 60 passed, in accordance with some embodiments of the present invention.
- the processor tracks phacoemulsifier 60, as described above with reference to Fig. 1, as the phacoemulsifier phacoemulsifies the lens of eye 42. Based on the tracking, the processor indicates, in image 34, the locations in the eye through which the phacoemulsifier passed, thereby facilitating a more efficient phacoemulsification. For example, in some embodiments, the processor demarcates the range of at least some of the locations in the image, e.g., with a three-dimensional bounding region 64. (For ease of illustration, bounding region 64 is shown in two dimensions.) Alternatively or additionally, the processor displays a curve 66 that passes through at least some of the locations.
- the processor further identifies, by processing the image (e.g., using a segmentation algorithm), one or more detached portions 68 of the lens that were not yet aspirated by phacoemulsifier 60, and marks portions 68 in the image.
- Fig. 5 also shows a chopper 62, which facilitates the phacoemulsification.
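The overlay itself can be as simple as accumulating tracked tip positions and drawing them. A minimal two-dimensional OpenCV sketch follows (the patent's bounding region 64 is three-dimensional); the colors and line widths are arbitrary choices for the example.

```python
import cv2
import numpy as np

def draw_phaco_path(image_bgr: np.ndarray, tip_positions_px) -> np.ndarray:
    """Overlay the phacoemulsifier tip trajectory (a curve through the
    visited locations) and a bounding region around those locations."""
    pts = np.asarray(tip_positions_px, dtype=np.int32)
    out = image_bgr.copy()
    cv2.polylines(out, [pts.reshape(-1, 1, 2)], isClosed=False,
                  color=(0, 255, 255), thickness=2)   # the curve
    x, y, w, h = cv2.boundingRect(pts)
    cv2.rectangle(out, (x, y), (x + w, y + h),
                  color=(0, 0, 255), thickness=1)     # the demarcated range
    return out
```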
- FIG. 6 is a schematic illustration of image 34 augmented to facilitate the implantation of an intraocular lens 72 in an astigmatic eye 42a, in accordance with some embodiments of the present invention.
- the processor marks, in image 34, the astigmatism axis 74 of the eye, which is typically defined by the operator prior to the procedure.
- the processor marks astigmatism axis 74 with a line, which is centered between two additional parallel lines.
- the processor also marks the visual axis 76 of the eye, which goes into the page in Fig. 6.
- the processor also overlays a marker 78 over the planned capsulorhexis incision and/or actual capsulorhexis incision.
- intraocular lens 72 is typically inserted through the capsulorhexis incision in a folded configuration.
- Fig. 6 shows the intraocular lens after the subsequent unfolding of the lens.
- the operator needs to align the center of the lens with visual axis 76, and also needs to rotate the lens such that an alignment marker 80 on the lens is aligned with astigmatism axis 74.
- the processor marks multiple angles 82 around the intraocular lens (e.g., at the limbus of the eye), thereby indicating to the operator how much the intraocular lens needs to be rotated in order for alignment marker 80 to be aligned with the astigmatism axis.
- the required rotation is around 60 degrees in the clockwise direction.
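The required rotation reduces to a signed angular difference, remembering that both the alignment marker and the astigmatism axis are 180-degree periodic. A sketch, with angle conventions assumed for illustration:

```python
def iol_rotation_needed(marker_angle_deg: float,
                        astig_axis_deg: float) -> float:
    """Signed rotation (degrees) aligning the IOL marker with the
    astigmatism axis. Axes repeat every 180 degrees, so the result lies
    in (-90, 90]; negative values indicate clockwise rotation."""
    diff = (astig_axis_deg - marker_angle_deg) % 180.0
    return diff if diff <= 90.0 else diff - 180.0

# Example matching the roughly 60-degree clockwise rotation noted above
# (the specific input angles are assumed):
print(iol_rotation_needed(marker_angle_deg=10.0, astig_axis_deg=130.0))  # -60.0
```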
- Fig. 7 is a schematic illustration of image 34 augmented to show a warning 84, in accordance with some embodiments of the present invention.
- while a robotic arm 20 manipulates a tool, such as cannula 46, within eye 42 under the control of the operator, there is a risk of the tool contacting a sensitive portion of the eye (e.g., the retina).
- the processor tracks the tool, as described above with reference to Fig. 1. If, based on the tracking, the processor ascertains that the tool is within a predefined threshold distance of a sensitive portion of the eye, the processor inhibits the operator from moving the tool closer to the sensitive portion of the eye.
- the processor displays warning 84 to the operator on display 24.
- warning 84 includes an icon and/or one or more words.
- warning 84 includes a halo, which is typically brightly colored, displayed around the cornea of the eye.
- the processor provides force feedback to the operator via control-component unit 26, e.g., using techniques for force feedback described in co-assigned US Patent Application Publication 2023/0240779 to Golan, whose disclosure is incorporated herein by reference.
- the processor computes a force function based on the distance of the tool from the sensitive portion of the eye and/or based on the direction in which the operator attempts to move the tool.
- the force function returns a force vector, which is provided by control component 70 (Fig. 1) to the operator.
- the processor disregards the command from the operator, and does not move the tool closer to the sensitive portion of the eye.
- the processor prevents the control component from moving in the direction of the sensitive portion of the eye.
- the processor displays warning 84 and/or begins to provide force feedback in response to the tool being within a first predefined threshold distance. If the operator nonetheless moves the tool to within a second, smaller predefined threshold distance and then attempts to move the tool even closer to the sensitive portion of the eye, the processor disregards the command from the operator or prevents the control component from moving in the direction of the sensitive portion of the eye.
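The two-threshold behavior described above can be captured in a small guard function. The threshold values, the linear force ramp, and the return convention are all assumptions made for this sketch.

```python
import numpy as np

def safety_guard(tip_mm, sensitive_mm, move_dir,
                 warn_mm=3.0, block_mm=1.0):
    """Two-threshold guard for motion near a sensitive portion of the eye.

    Returns ("ok" | "warn" | "block", force_magnitude in [0, 1]).
    Inside warn_mm, a warning is shown and force feedback ramps up;
    inside block_mm, motion toward the sensitive portion is refused.
    Threshold values here are illustrative assumptions.
    """
    to_sensitive = np.asarray(sensitive_mm) - np.asarray(tip_mm)
    dist = float(np.linalg.norm(to_sensitive))
    approaching = float(np.dot(move_dir, to_sensitive)) > 0.0
    if dist <= block_mm and approaching:
        return "block", 1.0   # disregard the operator's command
    if dist <= warn_mm:
        ramp = (warn_mm - dist) / (warn_mm - block_mm)
        return "warn", float(np.clip(ramp, 0.0, 1.0))
    return "ok", 0.0
```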
- image-augmentation functionality described herein can also be performed during a manual surgical procedure in which the surgical tools are manipulated directly by operator 25 (Fig. 1), without the use of any robotic arms.
- processor 28 performs image analysis on one or more images acquired by imaging system 22, in order to extract parameters from the images (e.g., features of the eye and/or movement of one or more tools) that are necessary in order to generate the augmentation.
- the computer processor receives inputs from the robotic arms 20 and/or control components 70 that are indicative of movement of one or more tools relative to the patient’s eye.
- Embodiments of the invention described herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium (e.g., a non-transitory computer-readable medium) providing program code for use by or in connection with a computer or any instruction execution system, such as computer processor 28 (Fig. 1).
- a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- the medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium.
- the computer-usable or computer readable medium is a non-transitory computer-usable or computer readable medium.
- Examples of a computer-readable medium include a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random-access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk.
- Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W), and DVD; a USB drive is a further example of an electronic storage medium.
- a data processing system suitable for storing and/or executing program code will include at least one processor (e.g., computer processor 28) coupled directly or indirectly to memory elements through a system bus.
- the memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
- the system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments of the invention.
- Network adapters may be coupled to the processor to enable the processor to become coupled to other processors or remote printers or storage devices through intervening private or public networks. Modems, cable modems, and Ethernet cards are just a few of the currently available types of network adapters.
- Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the C programming language or similar programming languages.
- These computer program instructions may also be stored in a computer-readable medium (e.g., a non-transitory computer-readable medium) that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the algorithms.
- the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the algorithms described in the present application.
- Computer processor 28 is typically a hardware device programmed with computer program instructions to produce a special purpose computer. For example, when programmed to perform the algorithms described with reference to the figures, computer processor 28 typically acts as a special purpose image-augmentation computer processor. Typically, the operations described herein that are performed by computer processor 28 transform the physical state of a memory, which is a real physical article, to have a different magnetic polarity, electrical charge, or the like depending on the technology of the memory that is used. For some embodiments, operations that are described as being performed by a computer processor are performed by a plurality of computer processors in combination with each other.
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Public Health (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Molecular Biology (AREA)
- Medical Informatics (AREA)
- Robotics (AREA)
- Vascular Medicine (AREA)
- Pathology (AREA)
- Gynecology & Obstetrics (AREA)
- Radiology & Medical Imaging (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Prostheses (AREA)
Abstract
Apparatus and methods are described including a display (24) and a processor (28). The processor (28) is configured, during a surgical procedure on an eye, to calculate, for at least one point or range of points on a cornea of the eye, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points, to demarcate the range of the tool in an image of the eye, and to display the image on the display (24). Other applications are also described.
Description
AUGMENTED IMAGES FOR FACILITATING OPHTHALMIC SURGERY
CROSS-REFERENCE TO RELATED APPLICATIONS
The present application claims priority from US Provisional Application 63/639,644 to Levinson et al., entitled “Augmented images for facilitating ophthalmic surgery,” filed April 28, 2024, which is incorporated herein by reference.
FIELD OF EMBODIMENTS OF THE INVENTION
The present invention is related generally to computer-facilitated surgery, and specifically to computer-facilitated ophthalmic surgery.
BACKGROUND
Cataract surgery involves the removal of the natural lens of the eye that has developed an opacification (known as a cataract), and its replacement with an intraocular lens. Such surgery typically involves a number of standard steps, which are performed sequentially.
In an initial step, the patient's face around the eye is disinfected (typically, with iodine solution), and the face is covered by a sterile drape, such that only the eye is exposed. When the disinfection and draping have been completed, the eye is anesthetized, typically using a local anesthetic, which is administered in the form of liquid eye drops. The eyeball is then exposed, using an eyelid speculum that holds the upper and lower eyelids open. One or more (e.g., 2-3) incisions, typically including at least one larger incision having a three-planar form, are made in the cornea of the eye. The incisions are typically made using a specialized blade, which is called a keratome blade. Subsequently, another anesthetic, such as lidocaine, is injected into the anterior chamber of the eye via the corneal incisions. Following this step, the pupil is dilated, and a viscoelastic injection is applied via the corneal incisions. The viscoelastic injection is performed in order to stabilize the anterior chamber and to help maintain eye pressure during the remainder of the procedure, and also in order to distend the lens capsule.
In a subsequent stage, known as capsulorhexis, a part of the anterior lens capsule is removed, using one or more tools inserted via the corneal incisions. Various enhanced techniques have been developed for performing capsulorhexis, such as laser-assisted capsulorhexis, zepto-rhexis (which utilizes precision nano-pulse technology), and marker-assisted capsulorhexis (in which the cornea is marked using a predefined marker, in order to indicate the desired size for the capsule opening).
Subsequently, it is common for a fluid wave to be injected via the corneal incisions, in order to dissect the cataract's outer cortical layer, in a step known as hydrodissection. In a subsequent step, known as hydrodelineation, the outer softer epi-nucleus of the lens is separated from the inner firmer endo-nucleus by the injection of a fluid wave. In the next step, ultrasonic emulsification of the lens is performed, in a process known as phacoemulsification. The nucleus of the lens is broken initially using a chopper, following which the outer fragments of the lens are broken and removed, typically using an ultrasonic phacoemulsification probe. When the phacoemulsification is complete, the remaining lens cortex (i.e., the outer layer of the lens) and viscoelastic material is aspirated from the capsule. During the phacoemulsification and the aspiration, aspirated fluids are typically replaced with irrigation of a balanced salt solution, in order to maintain fluid pressure in the anterior chamber.
In some cases, if deemed to be necessary, the capsule is polished. Subsequently, the intraocular lens (IOL) is inserted into the capsule. The IOL is typically foldable and is inserted in a folded configuration, before unfolding inside the capsule. If necessary, one or more of the incisions are sealed by elevating the pressure inside the bulbus oculi (i.e., the globe of the eye), causing the internal tissue to be pressed against the external tissue of the incisions, such as to force closed the incisions.
SUMMARY
Some embodiments of the present invention provide a computer processor configured to augment an image of an eye during a surgical procedure on the eye. Such embodiments are applicable, for example, to a robotic surgical procedure in which a robotic arm manipulates one or more tools under control of an operator, in that the augmentation of the image helps the operator manipulate the tools more safely and/or effectively.
For example, in some embodiments, the processor is configured to calculate, for at least one point or range of points on a cornea of the eye, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points. The processor is further configured to demarcate the range of the tool, e.g., with a three-dimensional bounding region, in an image of the eye. In some embodiments, this augmentation is performed prior to an incision being made in the cornea, so as to help the operator place the incision.
As another example, in some embodiments, the processor is configured to track a cutting tool and the eye as the cutting tool is used to make an incision in the eye (e.g., in the cornea or lens capsule of the eye), and to mark the incision in an image, such as an optical coherence tomography image, of the eye. In some embodiments, the processor is further configured to compute at least one measure of deviation between the incision and a planned incision, based on respective shapes and/or positions of the incision and the planned incision, and to display an output indicating the measure of deviation.
As another example, in some embodiments, the processor is configured to track a phacoemulsifier as the phacoemulsifier phacoemulsifies a lens of the eye. Based on the tracking, the processor indicates, in an image of the eye, locations in the eye through which the phacoemulsifier passed, e.g., by demarcating a range of at least some of the locations and/or by displaying a curve that passes through at least some of the locations. In some embodiments, the processor is further configured to identify, by processing the image, one or more detached portions of the lens that were not yet aspirated by the phacoemulsifier, and to mark the portions in the image.
As yet another example, in some embodiments, the processor facilitates the implantation of an intraocular lens, which includes an alignment marker, in an astigmatic eye, by marking, in an image of the eye, an astigmatism axis of the eye and multiple angles around the intraocular lens, thereby indicating how much the intraocular lens needs to be rotated in order for the alignment marker to be aligned with the astigmatism axis.
In some embodiments, during a robotic surgical procedure, the processor is configured to help prevent damage to sensitive portions of the eye by augmenting an image of the eye and/or by interfering with the control of the robotic arm. In particular, the processor is configured to track movement of a tool manipulated by the robotic arm, to ascertain, based on the tracking, that the tool is within a predefined threshold distance of a sensitive portion of the eye, and to inhibit the operator from moving the tool closer to the sensitive portion of the eye, in response to the ascertaining. For example, in some embodiments, the processor is configured to inhibit the operator by providing force feedback to the operator via the control-component unit used by the operator to control the robotic arm, and/or by disregarding a command from the operator to move the tool closer to the sensitive portion of the eye. Alternatively or additionally, the processor displays a warning to the operator, e.g., by displaying a halo around the cornea of the eye in an image of the eye.
There is therefore provided, in accordance with some embodiments of the present invention, a system including a display and a processor. The processor is configured to calculate, during a surgical procedure on an eye, for at least one point or range of points on a cornea of the eye, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points. The processor is further configured to demarcate the range of the tool in an image of the eye, and to display the image on the display.
In some embodiments, the surgical procedure is a robotic surgical procedure in which a robotic arm manipulates the tool.
In some embodiments, the processor is configured to demarcate the range of the tool with a three-dimensional bounding region.
In some embodiments, the processor is further configured to mark the point or the range of points in the image.
In some embodiments, the processor is configured to demarcate the range of the tool prior to an incision being made at the point or along the range of points.
In some embodiments, the processor is configured to calculate and demarcate the range of the tool in response to a tip of a cutting tool for making the incision pointing toward the point.
In some embodiments, the processor is configured to calculate and demarcate the range of the tool in response to the tip being within a predefined distance of the cornea.
In some embodiments, the processor is configured to demarcate the range of the tool after an incision is made at the point or along the range of points.
There is further provided, in accordance with some embodiments of the present invention, a method including, during a surgical procedure on an eye, for at least one point or range of points on a cornea of the eye, calculating, by a processor, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points. The method further includes demarcating the range of the tool, by the processor, in an image of the eye, and displaying the image, by the processor, on a display.
There is further provided, in accordance with some embodiments of the present invention, a computer software product including a tangible non-transitory computer-readable
medium in which program instructions are stored. The instructions, when read by a processor, cause the processor to calculate, during a surgical procedure on an eye, for at least one point or range of points on a cornea of the eye, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points. The instructions further cause the processor to demarcate the range of the tool in an image of the eye, and to display the image on a display.
There is further provided, in accordance with some embodiments of the present invention, a system including a display and a processor. The processor is configured to identify, during a surgical procedure on an eye in which a cutting tool is used to make an incision in the eye, a location of the incision by tracking the tool and the eye. The processor is further configured to mark the incision in at least one image of the eye, and to display the image on the display.
In some embodiments, the surgical procedure is a robotic surgical procedure in which a robotic arm manipulates the cutting tool.
In some embodiments, the image includes an optical coherence tomography image.
In some embodiments, the incision is in a lens capsule of the eye.
In some embodiments, the processor is further configured to: compute at least one measure of deviation between the incision and a planned incision, based on respective shapes and/or positions of the incision and the planned incision, and display an output indicating the measure of deviation.
In some embodiments, the incision is in a cornea of the eye.
In some embodiments, the processor is further configured to mark, in the image, a point along the incision at which another tool is inserted into the eye.
In some embodiments, the processor is further configured to: compare a shape of the incision to a canonical shape, and based on the comparing, display an output indicating a deviation of the shape from the canonical shape.
In some embodiments, the canonical shape is that of a three-planar incision.
There is further provided, in accordance with some embodiments of the present invention, a method including, during a surgical procedure on an eye in which a cutting tool is used to make an incision in the eye, identifying a location of the incision, by a processor, by tracking the tool and the eye. The method further includes marking the incision, by the processor, in at least one image of the eye, and displaying the image, by the processor, on a display.
There is further provided, in accordance with some embodiments of the present invention, a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored. The instructions, when read by a processor, cause the processor to identify, during a surgical procedure on an eye in which a cutting tool is used to make an incision in the eye, a location of the incision by tracking the tool and the eye. The instructions further cause the processor to mark the incision in at least one image of the eye, and to display the image on a display.
There is further provided, in accordance with some embodiments of the present invention, a system including a display and a processor. The processor is configured to track, during a surgical procedure on an eye, a border of a pupil of the eye as the pupil dilates. The processor is further configured to augment an image of the eye with an indication of a size of the pupil, based on the tracking, and to display the image on the display.
In some embodiments, the surgical procedure is a robotic surgical procedure.
In some embodiments, the processor is configured to augment the image by marking the border of the pupil.
In some embodiments, the processor is configured to augment the image by displaying the size in association with the image.
There is further provided, in accordance with some embodiments of the present invention, a method including, during a surgical procedure on an eye, tracking a border of a pupil of the eye, by a processor, as the pupil dilates. The method further includes, based on the tracking, augmenting an image of the eye, by the processor, with an indication of a size of the pupil, and displaying the image, by the processor, on a display.
There is further provided, in accordance with some embodiments of the present invention, a computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored. The instructions, when read by a processor, cause the processor to track, during a surgical procedure on an eye, a border of a pupil of the eye as the pupil dilates. The instructions further cause the processor to augment an image of the eye with an indication of a size of the pupil, based on the tracking, and to display the image on a display.
There is further provided, in accordance with some embodiments of the present invention, a system including a display and a processor. The processor is configured to track a phacoemulsifier as the phacoemulsifier phacoemulsifies a lens of an eye. The processor is further configured to indicate in an image of the eye, based on the tracking, locations in the eye through which the phacoemulsifier passed, and to display the image on the display.
In some embodiments, the phacoemulsifier is manipulated by a robotic arm.
In some embodiments, the processor is configured to indicate the locations by demarcating a range of at least some of the locations in the image.
In some embodiments, the processor is configured to demarcate the range with a three- dimensional bounding region.
In some embodiments, the processor is configured to indicate the locations by displaying a curve that passes through at least some of the locations.
In some embodiments, the processor is further configured to: identify, by processing the image, one or more detached portions of the lens that were not yet aspirated by the phacoemulsifier, and mark the portions in the image.
There is further provided, in accordance with some embodiments of the present invention, a method including tracking a phacoemulsifier, by a processor, as the phacoemulsifier phacoemulsifies a lens of an eye. The method further includes, based on the tracking, indicating in an image of the eye, by the processor, locations in the eye through which the phacoemulsifier passed, and displaying the image, by the processor, on a display.
There is further provided, in accordance with some embodiments of the present invention, a computer software product including a tangible non-transitory computer-readable
medium in which program instructions are stored. The instructions, when read by a processor, cause the processor to track a phacoemulsifier as the phacoemulsifier phacoemulsifies a lens of an eye. The instructions further cause the processor to indicate in an image of the eye, based on the tracking, locations in the eye through which the phacoemulsifier passed, and to display the image on a display.
There is further provided, in accordance with some embodiments of the present disclosure, the following examples, at least some of which (e.g., Examples 1, 3, 5, 7, 13, and 19) are independent aspects of the present disclosure.
Example 1. A system, including: a display; and a processor, configured to: while an intraocular lens, which includes an alignment marker, is implanted in an astigmatic eye, mark, in an image of the eye, an astigmatism axis of the eye and multiple angles around the intraocular lens, thereby indicating how much the intraocular lens needs to be rotated in order for the alignment marker to be aligned with the astigmatism axis, and display the image on the display.
Example 2. The system according to Example 1, wherein the intraocular lens is inserted by a robotic arm.
Example 3. A method, including: while an intraocular lens, which includes an alignment marker, is implanted in an astigmatic eye, marking in an image of the eye, by a processor, an astigmatism axis of the eye and multiple angles around the intraocular lens, thereby indicating how much the intraocular lens needs to be rotated in order for the alignment marker to be aligned with the astigmatism axis; and displaying the image, by the processor, on a display.
Example 4. The method according to Example 3, wherein the intraocular lens is inserted by a robotic arm.
Example 5. A computer software product including a tangible non-transitory computer- readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to: while an intraocular lens, which includes an alignment marker, is implanted in an astigmatic eye, mark, in an image of the eye, an astigmatism axis of the eye and multiple angles around the intraocular lens, thereby indicating how much the intraocular lens needs to be rotated in order for the alignment marker to be aligned with the astigmatism axis, and display the image on a display.
Example 6. The computer software product according to Example 5, wherein the intraocular lens is inserted by a robotic arm.
Example 7. A system, including: a robotic arm, configured to manipulate a tool within an eye under control of an operator; and a processor, configured to: track movement of the tool during a robotic surgical procedure on the eye in which the robotic arm manipulates the tool within the eye under the control of the operator, ascertain, based on the tracking, that the tool is within a predefined threshold distance of a sensitive portion of the eye, and in response to the ascertaining, inhibit the operator from moving the tool closer to the sensitive portion of the eye.
Example 8. The system according to Example 7, wherein the operator controls the robotic arm using a control-component unit, and wherein the processor is configured to inhibit the operator by providing force feedback to the operator via the control-component unit.
Example 9. The system according to Example 7, wherein the processor is configured to inhibit the operator by disregarding a command from the operator to move the tool closer to the sensitive portion of the eye.
Example 10. The system according to Example 7, wherein the operator controls the robotic arm by moving a control component, and wherein the processor is configured to inhibit the operator by preventing the control component from moving in a direction of the sensitive portion of the eye.
Example 11. The system according to any one of Examples 7-10, wherein the processor is configured to inhibit the operator by displaying a warning to the operator.
Example 12. The system according to Example 11, wherein the warning includes a halo around a cornea of the eye in an image of the eye.
Example 13. A method, including: during a robotic surgical procedure on an eye in which a robotic arm manipulates a tool within the eye under control of an operator, tracking, by a processor, movement of the tool; based on the tracking, ascertaining, by the processor, that the tool is within a predefined threshold distance of a sensitive portion of the eye; and in response to the ascertaining, inhibiting the operator, by the processor, from moving the tool closer to the sensitive portion of the eye.
Example 14. The method according to Example 13, wherein the operator controls the robotic arm using a control-component unit, and wherein inhibiting the operator includes inhibiting the operator by providing force feedback to the operator via the control-component unit.
Example 15. The method according to Example 13, wherein inhibiting the operator includes inhibiting the operator by disregarding a command from the operator to move the tool closer to the sensitive portion of the eye.
Example 16. The method according to Example 13, wherein the operator controls the robotic arm by moving a control component, and wherein inhibiting the operator includes inhibiting the operator by preventing the control component from moving in a direction of the sensitive portion of the eye.
Example 17. The method according to any one of Examples 13-16, wherein inhibiting the operator includes inhibiting the operator by displaying a warning to the operator.
Example 18. The method according to Example 17, wherein the warning includes a halo around a cornea of the eye in an image of the eye.
Example 19. A computer software product including a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to: track movement of a tool during a robotic surgical procedure on an eye in which a robotic arm manipulates the tool within the eye under control of an operator,
ascertain, based on the tracking, that the tool is within a predefined threshold distance of a sensitive portion of the eye, and in response to the ascertaining, inhibit the operator from moving the tool closer to the sensitive portion of the eye.
Example 20. The computer software product according to Example 19, wherein the operator controls the robotic arm using a control-component unit, and wherein the instructions cause the processor to inhibit the operator by providing force feedback to the operator via the control-component unit.
Example 21. The computer software product according to Example 19, wherein the instructions cause the processor to inhibit the operator by disregarding a command from the operator to move the tool closer to the sensitive portion of the eye.
Example 22. The computer software product according to Example 19, wherein the operator controls the robotic arm by moving a control component, and wherein the instructions cause the processor to inhibit the operator by preventing the control component from moving in a direction of the sensitive portion of the eye.
Example 23. The computer software product according to any one of Examples 19-22, wherein the instructions cause the processor to inhibit the operator by displaying a warning to the operator.
Example 24. The computer software product according to Example 23, wherein the warning includes a halo around a cornea of the eye in an image of the eye.
The present invention will be more fully understood from the following detailed description of embodiments thereof, taken together with the drawings, in which:
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic illustration of a system for performing robotic surgery on an eye of a patient, in accordance with some embodiments of the present invention;
Figs. 2A, 2B, and 2C are schematic illustrations of images augmented to show a range of a tool during a surgical procedure on an eye, in accordance with some embodiments of the present invention;
Fig. 3A is a schematic illustration of an optical coherence tomography image augmented
to show an incision made in an eye during a surgical procedure on the eye, in accordance with some embodiments of the present invention;
Fig. 3B is a schematic illustration of automatic feedback with respect to an incision, in accordance with some embodiments of the present invention;
Fig. 4 is a schematic illustration of an image augmented to show the size of a pupil of an eye, in accordance with some embodiments of the present invention;
Fig. 5 is a schematic illustration of an image augmented to show locations in the eye through which a phacoemulsifier passed, in accordance with some embodiments of the present invention;
Fig. 6 is a schematic illustration of an image augmented to facilitate the implantation of an intraocular lens in an astigmatic eye, in accordance with some embodiments of the present invention; and
Fig. 7 is a schematic illustration of an image augmented to show a warning, in accordance with some embodiments of the present invention.
DETAILED DESCRIPTION
SYSTEM DESCRIPTION
Reference is initially made to Fig. 1, which is a schematic illustration of a system 10 for performing robotic surgery (e.g., cataract surgery) on an eye 42 of a patient 12, in accordance with some embodiments of the present invention.
System 10 comprises one or more (e.g., two) robotic arms 20 configured to hold respective surgical tools 21. System 10 further comprises a control-component unit 26 and a processor 28 (i.e., a computer processor). An operator 25, such as a physician or another healthcare professional, provides inputs to processor 28 via control-component unit 26. In response to these inputs, the processor communicates corresponding outputs to robotic arms 20, thereby moving the robotic arms (and hence surgical tools 21) in accordance with the inputs.
In particular, control-component unit 26 comprises one or more (e.g., two) control components 70, each of which corresponds to a different respective robotic arm 20. Each control component 70 comprises a control tool 71, which operator 25 holds and manipulates as if control tool 71 were the surgical tool 21 held by the corresponding robotic arm. (Typically,
control tool 71 is generically-shaped, such that the size and shape of the control tool do not exactly match the size and shape of the corresponding surgical tool; for example, the control tool may comprise a stylus.) In response to the manipulation of each control tool, the processor causes the corresponding robotic arm 20 to manipulate the surgical tool 21 correspondingly to the manipulation of the control tool, such that the operator controls the surgical tool via the control tool.
For example, operator 25 may control the pose (i.e., the position and orientation) of each surgical tool by appropriate manipulation of the corresponding control tool. In particular, processor 28 may move each surgical tool such that the position of a predetermined portion (e.g., the distal tip) of the surgical tool tracks the position of a predetermined portion of the corresponding control tool, and the orientation of the surgical tool matches that of the control tool. In addition, the operator may control the operations performed by each surgical tool by appropriate manipulation of the corresponding control tool.
In some embodiments, the processor is configured to scale movement of each surgical tool, relative to movement of the corresponding control tool, by a scale factor. Typically, this scale factor is less than one, i.e., for every movement of 1 mm by the control tool, the processor moves the surgical tool by 1/X mm, where X > 1.
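By way of illustration only, the following Python sketch shows one way such motion scaling might be implemented. The function name, the use of a displacement vector in millimeters, and the example scale factor of 1/5 are hypothetical; the present disclosure does not specify an implementation.

```python
import numpy as np

def scale_motion(control_delta_mm, scale_factor: float = 0.2) -> np.ndarray:
    """Map a control-tool displacement (mm) to a surgical-tool displacement.

    With scale_factor = 1/X and X > 1, every 1 mm movement of the control
    tool produces a 1/X mm movement of the surgical tool, as described above.
    """
    assert 0.0 < scale_factor < 1.0, "the description assumes a scale factor of less than one"
    return scale_factor * np.asarray(control_delta_mm, dtype=float)

# Example: with X = 5, a 1 mm movement of the control tool along x
# moves the surgical tool by 0.2 mm along x.
print(scale_motion([1.0, 0.0, 0.0]))  # -> [0.2 0.  0. ]
```

Scaling down the operator's hand motion in this way lets comparatively coarse hand movements produce the fine tool movements needed inside the eye.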
Typically, each control component 70 comprises a control arm 30 coupled to control tool 71. Control arm 30 comprises multiple joints 32 providing multiple degrees of freedom for the movement of control tool 71, along with respective sensors for joints 32. The sensors are configured to communicate, to processor 28, signals indicative of any changes in the orientations of the joints (and thus, any movement of control tool 71), such that the processor may cause the corresponding robotic arm 20 to move surgical tool 21 in a corresponding manner.
In some embodiments, surgical tools 21 comprise a phacoemulsification tool, for which the operation mode and/or suction power can be controlled. Alternatively or additionally, surgical tools 21 comprise an injector tool such as a cannula, for which a fluid, such as a viscoelastic fluid or saline, can be selected for injection, and for which the flow rate of the selection can be controlled. Alternatively or additionally, surgical tools 21 comprise an intraocular-lens-implanting tool, which is controlled so as to implant an intraocular lens in the eye, the implantation operation including inserting, moving for precise positioning, and/or rotating the intraocular lens. Alternatively or additionally, surgical tools 21 comprise a blade
and/or forceps.
System 10 further comprises an imaging system 22 comprising, for example, a wide- angle camera, a microscope camera, an optical coherence tomography imager, and/or any other suitable imaging device. System 10 further comprises one or more displays 24. In some embodiments, at least one of the cameras belonging to imaging system 22 is stereoscopic, and at least one of displays 24 is a 3D display.
Imaging system 22 images the surgical site and communicates the images to processor 28. In response to receiving the images, the processor causes displays 24 to display the images. As described below with reference to the subsequent figures, processor 28 is further configured to augment the displayed images with various indicia that facilitate the procedure.
Typically, due to the need for real-time visual feedback while performing the procedure, processor 28 causes at least one of the displays to display a real-time video (e.g., a real-time 3D video) of the surgical site. In other words, imaging system 22 acquires images of the surgical site in rapid succession (e.g., at a frame rate of at least 60 frames per second), and processor 28 displays these images (e.g., with augmentation) in real-time on the display, e.g., such that the delay between the acquisition of each image and the display of the image is less than 80 ms. (In general, when referring to the frames of a video, the present description may use the terms “image” and “frame” interchangeably.) Typically, the images in the real-time video are acquired by a microscope camera, such that the surgical site is magnified in the real-time video.
Typically, during the procedure, the processor tracks eye 42 by processing the images acquired by imaging system 22. For example, in some embodiments, the processor identifies the limbus of the eye in the images. In some embodiments, the processor also processes the images so as to track the pose of each surgical tool. For example, each tool may comprise a marker, and the processor may compute the pose based on the pose of the marker in the image. In other embodiments, the processor tracks the pose of each surgical tool using other techniques, such as electromagnetic tracking.
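By way of illustration only, the following Python sketch shows one plausible way of locating the limbus in a video frame using a Hough circle transform; the OpenCV parameter values, the radius bounds, and the function name are placeholders, and the present disclosure does not prescribe any particular detection algorithm.

```python
import cv2
import numpy as np

def detect_limbus(frame_bgr: np.ndarray):
    """Estimate the limbus (iris/sclera border) as a circle (x, y, r) in pixels.

    Returns None if no circular candidate is found.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # suppress specular highlights and noise
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=gray.shape[0],
        param1=120, param2=40, minRadius=80, maxRadius=250,
    )
    if circles is None:
        return None
    x, y, r = circles[0, 0]  # strongest candidate
    return float(x), float(y), float(r)
```

Tracking the detected circle from frame to frame gives the eye's pose in the image plane, relative to which tool positions and overlays can be expressed.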
In some embodiments, in addition to control components 70, control-component unit 26 comprises one or more other input interfaces (e.g., buttons). Alternatively or additionally, system 10 comprises one or more separate input interfaces (e.g., a foot pedal). In some embodiments, these input interfaces can be used, by operator 25, to control the operations performed by each surgical tool. Alternatively or additionally, these input interfaces, and/or
control components 70, can be used to control imaging system 22. For example, in some embodiments, the operator can adjust the x-y positioning of the imaging system, the zoom (i.e., the size of the field of view) of any of the imaging devices belonging to the imaging system, and/or the focus of any of the imaging devices belonging to the imaging system.
In general, processor 28 may be embodied as a single processor, or as a cooperatively networked or clustered set of processors. For example, in some embodiments, a first processor controls the movement of robotic arms 20, while a second processor performs the image-augmentation functionality described herein.
The functionality of processor 28 may be implemented solely in hardware, e.g., using one or more fixed-function or general-purpose integrated circuits, Application-Specific Integrated Circuits (ASICs), and/or Field-Programmable Gate Arrays (FPGAs). Alternatively, this functionality may be implemented at least partly in software. For example, processor 28 may be embodied as a programmed processor comprising, for example, a central processing unit (CPU) and/or a Graphics Processing Unit (GPU). Program code, including software programs, and/or data may be loaded for execution and processing by the CPU and/or GPU. The program code and/or data may be downloaded to the processor in electronic form, over a network, for example. Alternatively or additionally, the program code and/or data may be provided and/or stored on non-transitory tangible media, such as magnetic, optical, or electronic memory. Such program code and/or data, when provided to the processor, produce a machine or special-purpose computer, configured to perform the tasks described herein.
IMAGE AUGMENTATION
Reference is now made to Figs. 2A-C, which are schematic illustrations of images 34 augmented to show a range of a tool during the surgical procedure on eye 42, in accordance with some embodiments of the present invention.
In some embodiments, processor 28 (Fig. 1) is configured to calculate, for at least one point or range of points on the cornea of eye 42, a range of a tool within the eye, provided that the tool enters the eye via the point or via any point along the range of points. The processor is further configured to augment at least one image 34 of the eye, such as an overhead image of the eye acquired by a microscope camera belonging to imaging system 22 (Fig. 1), by demarcating the range of the tool in the image, and to display the image on display 24. In some embodiments, the augmented image is a three-dimensional image, and the range of the tool is
demarcated with a three-dimensional bounding region 38. (For ease of illustration, Figs. 2A-C show bounding region 38 in two dimensions.) In some embodiments, the processor also marks the point or range of points in the image.
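By way of illustration only, the range of the tool for a given entry point might be approximated by sampling the tip positions reachable when the tool pivots about that point, as in the following Python sketch; the pivot-cone angle, the maximum insertion depth, and the function name are hypothetical assumptions, and the straight-shaft model is a simplification.

```python
import numpy as np

def sample_tool_range(entry_point, inward_dir, max_insertion_mm=15.0,
                      max_pivot_deg=45.0, n_samples=2000, seed=0):
    """Sample tip positions reachable by a straight tool entering the eye at
    `entry_point` (mm) and pivoting about that point.

    The convex hull of the returned (n, 3) point cloud could serve as the
    three-dimensional bounding region (e.g., bounding region 38) overlaid
    on the image.
    """
    rng = np.random.default_rng(seed)
    entry_point = np.asarray(entry_point, dtype=float)
    inward_dir = np.asarray(inward_dir, dtype=float)
    inward_dir = inward_dir / np.linalg.norm(inward_dir)

    points = []
    while len(points) < n_samples:
        v = rng.normal(size=3)
        v /= np.linalg.norm(v)
        # Keep only shaft directions within the pivot cone about the
        # inward corneal normal at the entry point.
        angle = np.degrees(np.arccos(np.clip(v @ inward_dir, -1.0, 1.0)))
        if angle > max_pivot_deg:
            continue
        depth = rng.uniform(0.0, max_insertion_mm)
        points.append(entry_point + depth * v)
    return np.asarray(points)
```

For a range of points (as in Fig. 2C), taking the union of such point clouds over entry points sampled along the incision yields the combined demarcated range.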
In some embodiments, the range of the tool is demarcated prior to an incision being made at the point or along the range of points, to help the operator decide whether to make the incision at that location. In some such embodiments, the processor calculates the range of the first tool that is to enter the eye, via the incision, following the making of the incision. Alternatively, the processor calculates the range of any other tool, such as a phacoemulsifier, that is to enter the eye via the incision.
For example, Fig. 2A depicts a scenario in which a cutting tool 36 is to be used to incise the cornea of the eye. As cutting tool 36 is moved toward and/or across the cornea prior to the incision, the processor calculates and demarcates the range of another tool (e.g., a phacoemulsifier) for one or more candidate incision points 40, which are optionally marked in the image. For example, in some embodiments, the processor calculates and demarcates the range of the other tool in response to the tip of the cutting tool pointing toward point 40 and, optionally, being within a predefined distance (e.g., 2 mm) of the cornea.
Alternatively or additionally, the range of the tool is demarcated after an incision is made at the point or along the range of points, typically as the tool approaches the incision and/or after the tool has entered the eye via the incision. The processor identifies the location of the incision by tracking cutting tool 36 (i.e., tracking the pose of the cutting tool) and eye 42, as described above with reference to Fig. 1, as the cutting tool makes the incision. Typically, the processor marks the incision in the image.
In some embodiments, the demarcation of the range of the tool is done for a single point along the incision, such as the point via which the tool enters the eye, or for multiple discrete points along the incision. For example, in Fig. 2B, an incision 44, which has been made in the cornea, is marked in image 34. Incision 44 spans a range of points at which a cannula 46 can enter the eye. As cannula 46 approaches the incision and/or after the cannula has entered the eye via the incision, the processor demarcates the range of the cannula for two points along the incision, each point being at or near a respective end of the incision.
In other embodiments, as shown in Fig. 2C, the range of the tool is demarcated assuming the tool can enter the eye via any point along the incision, such that the demarcated range is a
union of the respective ranges of all points along the incision.
Similarly, in some embodiments, prior to an incision being made, the operator defines a candidate incision that spans a range of points. Optionally, the processor marks the candidate incision in the image. The processor then calculates and demarcates the range of the tool for one or more discrete points along the candidate incision, as in Fig. 2B, or for the entire incision, as in Fig. 2C.
Reference is now made to Fig. 3A, which is a schematic illustration of an optical coherence tomography image 48 augmented to show an incision 44 made in eye 42 during the surgical procedure on the eye, in accordance with some embodiments of the present invention.
As described above with reference to Figs. 2B-C, in some embodiments, the processor is configured to mark at least one corneal incision 44 in image 34, which in some embodiments includes an overhead image of the eye acquired by a microscope camera. Alternatively or additionally, the processor is configured to mark incision 44 in optical coherence tomography image 48, which typically shows a side view of a portion of the eye from a vantage point along a central axis 50 of the eye. In some embodiments, the processor overlays image 48 over a corner of image 34.
In some embodiments, the processor is further configured to mark, in image 34 and/or optical coherence tomography image 48, the point along the incision at which a tool, such as cannula 46, is inserted into the eye. For example, the processor may display a circle 52 around this point or a line next to this point. Typically, the point of insertion functions as a virtual pivot point, in that rotational movements of the corresponding control tool 71 (Fig. 1) cause the tool to pivot with respect to this point, as described in co-assigned US Patent Application Publication 2023/0240779 to Golan, whose disclosure is incorporated herein by reference. (The aforementioned reference uses the term “remote center of motion” in place of “virtual pivot point.”) Thus, typically, the marking of the insertion point helps the operator control the tool.
Reference is now made to Fig. 3B, which is a schematic illustration of automatic feedback with respect to incision 44, in accordance with some embodiments of the present invention.
In some embodiments, in addition to marking incision 44, the processor is configured to compare the shape of the incision to a canonical shape 54, such as that of a three-planar incision, e.g., by performing a cross-correlation of an image of the incision with the canonical
shape. Based on the comparison, the processor displays an output indicating the deviation of the shape of incision 44 from canonical shape 54, thus helping the operator improve his or her incision-making skills.
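By way of illustration only, the following Python sketch computes one such comparison: the peak normalized cross-correlation between a binary image of the incision and a binary template of the canonical shape. The function name and the use of SciPy are assumptions, and the globally normalized score is only a rough similarity measure.

```python
import numpy as np
from scipy.signal import correlate2d

def shape_similarity(incision_mask: np.ndarray, canonical_mask: np.ndarray) -> float:
    """Peak normalized cross-correlation between two binary masks.

    Returns a score no greater than 1.0; 1.0 indicates a perfect match at
    some offset, so (1 - score) can serve as the displayed deviation.
    """
    a = incision_mask.astype(float) - incision_mask.mean()
    b = canonical_mask.astype(float) - canonical_mask.mean()
    corr = correlate2d(a, b, mode="full")  # correlation at every 2-D offset
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float(corr.max() / denom) if denom > 0 else 0.0
```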
Alternatively or additionally to marking a corneal incision, the processor marks an incision, such as a capsulorhexis incision, in the lens capsule of the eye, e.g., as shown in Fig. 6. In some embodiments, the processor identifies the location of the incision by tracking the cutting tool (which may comprise, for example, forceps, a laser-assisted capsulorhexis tool, or a zepto-rhexis tool) and eye 42, as described above with reference to Fig. 1, as the cutting tool makes the incision. In other embodiments, the processor identifies the incision by processing images of the eye, e.g., using an edge-detection algorithm.
Alternatively or additionally to marking the incision in the lens capsule, the processor computes at least one measure of deviation between the incision and the planned incision, based on the respective shapes and/or positions of the incision and the planned incision. The processor then displays an output indicating the measure of deviation.
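By way of illustration only, the following Python sketch computes two candidate deviation measures: a symmetric Hausdorff distance between the traced and planned contours (capturing shape) and a centroid offset (capturing position). The particular measures, the names, and the use of SciPy are hypothetical choices.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def incision_deviation(actual_pts: np.ndarray, planned_pts: np.ndarray) -> dict:
    """Deviation between two incision contours, each an (n, 2) array of
    image coordinates (e.g., in mm after calibration)."""
    shape_dev = max(directed_hausdorff(actual_pts, planned_pts)[0],
                    directed_hausdorff(planned_pts, actual_pts)[0])
    position_dev = np.linalg.norm(actual_pts.mean(axis=0) - planned_pts.mean(axis=0))
    return {"shape_deviation": float(shape_dev),
            "position_deviation": float(position_dev)}
```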
Reference is now made to Fig. 4, which is a schematic illustration of image 34 augmented to show the size of a pupil 56 of eye 42, in accordance with some embodiments of the present invention.
In some embodiments, during the surgical procedure, as pupil 56 dilates, the processor tracks the border 58 of the pupil, e.g., using an edge-detection algorithm. Based on the tracking, the processor augments image 34 with an indication of the size of the pupil. For example, in some embodiments, the processor marks border 58. Alternatively or additionally, the processor displays the size (e.g., the pupil diameter) in association with the image, e.g., over a portion of the image near the pupil.
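By way of illustration only, the following Python sketch estimates the pupil size by thresholding the dark pupil and fitting a minimum enclosing circle to its largest contour; the threshold value and the function name are placeholders, and conversion from pixels to millimeters (e.g., using the known limbus diameter as a scale reference) is omitted.

```python
import cv2
import numpy as np

def pupil_diameter_px(frame_bgr: np.ndarray):
    """Estimate the pupil diameter in pixels, or return None if not found."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # The pupil is typically the darkest large region; invert-threshold it.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)  # largest dark blob
    (_, _), radius = cv2.minEnclosingCircle(pupil)
    return 2.0 * radius
```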
Reference is now made to Fig. 5, which is a schematic illustration of image 34 augmented to show locations in the eye through which a phacoemulsifier 60 passed, in accordance with some embodiments of the present invention.
In some embodiments, the processor tracks phacoemulsifier 60, as described above with reference to Fig. 1, as the phacoemulsifier phacoemulsifies the lens of eye 42. Based on the tracking, the processor indicates, in image 34, the locations in the eye through which the phacoemulsifier passed, thereby facilitating a more efficient phacoemulsification. For example, in some embodiments, the processor demarcates the range of at least some of the locations in
the image, e.g., with a three-dimensional bounding region 64. (For ease of illustration, bounding region 64 is shown in two dimensions.) Alternatively or additionally, the processor displays a curve 66 that passes through at least some of the locations.
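By way of illustration only, the following Python sketch accumulates tracked tip positions and derives the two overlays described above; the class name, the axis-aligned box, and the moving-average smoothing are hypothetical simplifications of bounding region 64 and curve 66.

```python
import numpy as np

class PhacoPathTracker:
    """Accumulates phacoemulsifier tip positions, one per tracked frame."""

    def __init__(self):
        self.positions = []  # list of (x, y, z) tip positions in mm

    def add(self, tip_xyz):
        self.positions.append(np.asarray(tip_xyz, dtype=float))

    def bounding_region(self):
        """Axis-aligned 3-D box (min corner, max corner) over visited points."""
        p = np.vstack(self.positions)
        return p.min(axis=0), p.max(axis=0)

    def path_curve(self, window: int = 5):
        """Moving-average smoothing of the visited points, suitable for
        drawing a curve through at least some of the locations."""
        p = np.vstack(self.positions)
        kernel = np.ones(window) / window
        return np.column_stack([np.convolve(p[:, i], kernel, mode="valid")
                                for i in range(3)])
```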
In some embodiments, the processor further identifies, by processing the image (e.g., using a segmentation algorithm), one or more detached portions 68 of the lens that were not yet aspirated by phacoemulsifier 60, and marks portions 68 in the image.
It is noted that Fig. 5 also shows a chopper 62, which facilitates the phacoemulsification.
Reference is now made to Fig. 6, which is a schematic illustration of image 34 augmented to facilitate the implantation of an intraocular lens 72 in an astigmatic eye 42a, in accordance with some embodiments of the present invention.
In some embodiments, while intraocular lens 72 is implanted in astigmatic eye 42a, the processor marks, in image 34, the astigmatism axis 74 of the eye, which is typically defined by the operator prior to the procedure. Typically, the processor marks astigmatism axis 74 with a line, which is centered between two additional parallel lines. Typically, the processor also marks the visual axis 76 of the eye, which goes into the page in Fig. 6. In some embodiments, the processor also overlays a marker 78 over the planned capsulorhexis incision and/or actual capsulorhexis incision.
As described above in the Background, intraocular lens 72 is typically inserted through the capsulorhexis incision in a folded configuration. Fig. 6 shows the intraocular lens after the subsequent unfolding of the lens. At this stage of the implantation, the operator needs to align the center of the lens with visual axis 76, and also needs to rotate the lens such that an alignment marker 80 on the lens is aligned with astigmatism axis 74. To facilitate this rotation, in some embodiments, the processor marks multiple angles 82 around the intraocular lens (e.g., at the limbus of the eye), thereby indicating to the operator how much the intraocular lens needs to be rotated in order for alignment marker 80 to be aligned with the astigmatism axis. For example, in the scenario depicted in Fig. 6, the required rotation is around 60 degrees in the clockwise direction.
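By way of illustration only, the underlying angle computation might be carried out as in the following Python sketch; since both the marker axis and the astigmatism axis are undirected, angles are compared modulo 180 degrees. The example input angles are hypothetical and merely chosen to reproduce a rotation of about 60 degrees clockwise.

```python
def required_iol_rotation_deg(marker_axis_deg: float, astigmatism_axis_deg: float) -> float:
    """Smallest rotation (degrees; positive = counterclockwise) that aligns
    the IOL alignment marker with the astigmatism axis.

    Axes are undirected lines, so the comparison is modulo 180 degrees.
    """
    diff = (astigmatism_axis_deg - marker_axis_deg) % 180.0
    return diff if diff <= 90.0 else diff - 180.0

# Hypothetical example: marker at 80 degrees, astigmatism axis at 20 degrees.
print(required_iol_rotation_deg(80.0, 20.0))  # -> -60.0, i.e., 60 deg clockwise
```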
PROTECTING SENSITIVE PORTIONS OF THE EYE
Reference is now made to Fig. 7, which is a schematic illustration of image 34 augmented to show a warning 84, in accordance with some embodiments of the present
invention.
For embodiments in which a robotic arm 20 (Fig. 1) manipulates a tool, such as cannula 46, within eye 42 under the control of the operator, there is a risk of the tool contacting a sensitive portion of the eye (e.g., the retina), referred to colloquially as a “no fly zone.” To mitigate this risk, the processor tracks the tool, as described above with reference to Fig. 1. If, based on the tracking, the processor ascertains that the tool is within a predefined threshold distance of a sensitive portion of the eye, the processor inhibits the operator from moving the tool closer to the sensitive portion of the eye.
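By way of illustration only, the following Python sketch checks the threshold condition against a crude spherical retina model; a real system would presumably use a calibrated, patient-specific eye model, and the geometry constants, the threshold, and the function names here are placeholders.

```python
import numpy as np

GLOBE_CENTER_MM = np.array([0.0, 0.0, 0.0])  # hypothetical center of the globe
GLOBE_RADIUS_MM = 12.0                       # hypothetical globe radius

def distance_to_retina_mm(tip_xyz) -> float:
    """Approximate distance from the tool tip to the retina, modeled as the
    inner surface of a sphere; a negative value would mean penetration."""
    return GLOBE_RADIUS_MM - np.linalg.norm(np.asarray(tip_xyz, float) - GLOBE_CENTER_MM)

def should_inhibit(tip_xyz, threshold_mm: float = 2.0) -> bool:
    """True when the tip is within the predefined threshold distance of the
    sensitive portion, i.e., when the operator should be inhibited."""
    return distance_to_retina_mm(tip_xyz) <= threshold_mm
```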
For example, in some embodiments, the processor displays warning 84 to the operator on display 24. In some embodiments, warning 84 includes an icon and/or one or more words. Alternatively or additionally, warning 84 includes a halo, which is typically brightly colored, displayed around the cornea of the eye.
Alternatively or additionally, if the operator attempts to move the tool closer to the sensitive portion of the eye, the processor provides force feedback to the operator via control-component unit 26, e.g., using techniques for force feedback described in co-assigned US Patent Application Publication 2023/0240779 to Golan, whose disclosure is incorporated herein by reference. As a particular example, in some embodiments, the processor computes a force function based on the distance of the tool from the sensitive portion of the eye and/or based on the direction in which the operator attempts to move the tool. The force function returns a force vector, which is provided by control component 70 (Fig. 1) to the operator.
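By way of illustration only, the following Python sketch shows one possible force function, a spring-like repulsion whose magnitude grows linearly as the tip closes the remaining distance to the sensitive portion. The present disclosure leaves the force function unspecified, so the linear law, the gain, and the names are assumptions.

```python
import numpy as np

def repulsive_force(tip_xyz, nearest_boundary_xyz, threshold_mm: float = 2.0,
                    gain_n_per_mm: float = 0.5) -> np.ndarray:
    """Force vector pushing away from the nearest point on the sensitive
    portion; zero outside the threshold distance."""
    away = np.asarray(tip_xyz, float) - np.asarray(nearest_boundary_xyz, float)
    d = float(np.linalg.norm(away))
    if d >= threshold_mm or d == 0.0:
        return np.zeros(3)
    # Magnitude grows from zero (at the threshold) as the gap closes.
    return gain_n_per_mm * (threshold_mm - d) * (away / d)
```

The resulting vector would be rendered by the actuators of control component 70; a second, smaller threshold could then gate the stronger interventions (disregarding commands or locking the control component) described below.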
Alternatively or additionally, the processor disregards the command from the operator, and does not move the tool closer to the sensitive portion of the eye. Alternatively, the processor prevents the control component from moving in the direction of the sensitive portion of the eye.
For example, in some embodiments, the processor displays warning 84 and/or begins to provide force feedback in response to the tool being within a first predefined threshold distance. If the operator nonetheless moves the tool to within a second, smaller predefined threshold distance and then attempts to move the tool even closer to the sensitive portion of the eye, the processor disregards the command from the operator or prevents the control component from moving in the direction of the sensitive portion of the eye.
It is noted that the image-augmentation functionality described herein can also be performed during a manual surgical procedure in which the surgical tools are manipulated
directly by operator 25 (Fig. 1), without the use of any robotic arms.
For some applications, in order to generate one or more of the augmentations of the images described herein, processor 28 performs image analysis on one or more images acquired by imaging system 22, in order to extract parameters from the images (e.g., features of the eye and/or movement of one or more tools) that are necessary in order to generate the augmentation. Alternatively or additionally, the computer processor receives inputs from the robotic arms 20 and/or control components 70 that are indicative of movement of one or more tools relative to the patient's eye.
Embodiments of the invention described herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium (e.g., a non-transitory computer-readable medium) providing program code for use by or in connection with a computer or any instruction execution system, such as computer processor 28 (Fig. 1). For the purpose of this description, a computer-usable or computer-readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Typically, the computer-usable or computer-readable medium is a non-transitory computer-usable or computer-readable medium.
Examples of a computer-readable medium include a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random-access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W), DVD, and a USB drive.
A data processing system suitable for storing and/or executing program code will include at least one processor (e.g., computer processor 28) coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments of the invention.
Network adapters may be coupled to the processor to enable the processor to become coupled to other processors or remote printers or storage devices through intervening private or public networks. Modems, cable modem and Ethernet cards are just a few of the currently available types of network adapters.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object- oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the C programming language or similar programming languages.
It will be understood that the algorithms described herein can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer (e.g., computer processor 28) or other programmable data processing apparatus, create means for implementing the functions/acts specified in the algorithms described in the present application. These computer program instructions may also be stored in a computer-readable medium (e.g., a non-transitory computer-readable medium) that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the algorithms. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the algorithms described in the present application.
Computer processor 28 is typically a hardware device programmed with computer program instructions to produce a special-purpose computer. For example, when programmed to perform the algorithms described with reference to the figures, computer processor 28 typically acts as a special-purpose image-augmentation computer processor. Typically, the operations described herein that are performed by computer processor 28 transform the physical state of a memory, which is a real physical article, to have a different magnetic polarity,
electrical charge, or the like depending on the technology of the memory that is used. For some embodiments, operations that are described as being performed by a computer processor are performed by a plurality of computer processors in combination with each other.
It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof that are not in the prior art, which would occur to persons skilled in the art upon reading the foregoing description.
Claims
1. A system, comprising: a display; and a processor, configured to: during a surgical procedure on an eye, calculate, for at least one point or range of points on a cornea of the eye, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points, demarcate the range of the tool in an image of the eye, and display the image on the display.
2. The system according to claim 1, wherein the surgical procedure is a robotic surgical procedure in which a robotic arm manipulates the tool.
3. The system according to claim 1, wherein the processor is configured to demarcate the range of the tool with a three-dimensional bounding region.
4. The system according to claim 1, wherein the processor is further configured to mark the point or the range of points in the image.
5. The system according to any one of claims 1-4, wherein the processor is configured to demarcate the range of the tool prior to an incision being made at the point or along the range of points.
6. The system according to claim 5, wherein the processor is configured to calculate and demarcate the range of the tool in response to a tip of a cutting tool for making the incision pointing toward the point.
7. The system according to claim 6, wherein the processor is configured to calculate and demarcate the range of the tool in response to the tip being within a predefined distance of the cornea.
8. The system according to any one of claims 1-4, wherein the processor is configured to demarcate the range of the tool after an incision is made at the point or along the range of points.
9. A method, comprising: during a surgical procedure on an eye, for at least one point or range of points on a cornea of the eye, calculating, by a processor, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points;
demarcating the range of the tool, by the processor, in an image of the eye; and displaying the image, by the processor, on a display.
10. The method according to claim 9, wherein the surgical procedure is a robotic surgical procedure in which a robotic arm manipulates the tool.
11. The method according to claim 9, wherein demarcating the range of the tool comprises demarcating the range of the tool with a three-dimensional bounding region.
12. The method according to claim 9, further comprising marking the point or the range of points in the image.
13. The method according to any one of claims 9-12, wherein demarcating the range of the tool comprises demarcating the range of the tool prior to an incision being made at the point or along the range of points.
14. The method according to claim 13, wherein calculating and demarcating the range of the tool comprises calculating and demarcating the range of the tool in response to a tip of a cutting tool for making the incision pointing toward the point.
15. The method according to claim 14, wherein calculating and demarcating the range of the tool comprises calculating and demarcating the range of the tool in response to the tip being within a predefined distance of the cornea.
16. The method according to any one of claims 9-12, wherein demarcating the range of the tool comprises demarcating the range of the tool after an incision is made at the point or along the range of points.
17. A computer software product comprising a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to: during a surgical procedure on an eye, calculate, for at least one point or range of points on a cornea of the eye, a range of a tool within the eye provided that the tool enters the eye via the point or via any point along the range of points, demarcate the range of the tool in an image of the eye, and display the image on a display.
18. The computer software product according to claim 17, wherein the surgical procedure is a robotic surgical procedure in which a robotic arm manipulates the tool.
19. The computer software product according to claim 17, wherein the instructions cause the processor to demarcate the range of the tool with a three-dimensional bounding region.
20. The computer software product according to claim 17, wherein the instructions further cause the processor to mark the point or the range of points in the image.
21. The computer software product according to any one of claims 17-20, wherein the instructions cause the processor to demarcate the range of the tool prior to an incision being made at the point or along the range of points.
22. The computer software product according to claim 21, wherein the instructions cause the processor to calculate and demarcate the range of the tool in response to a tip of a cutting tool for making the incision pointing toward the point.
23. The computer software product according to claim 22, wherein the instructions cause the processor to calculate and demarcate the range of the tool in response to the tip being within a predefined distance of the cornea.
24. The computer software product according to any one of claims 17-20, wherein the instructions cause the processor to demarcate the range of the tool after an incision is made at the point or along the range of points.
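For orientation only, the following is a minimal sketch of the geometry behind claims 1-24: modeling a rigid straight tool that pivots about its corneal entry point, the region it can reach inside the eye is approximately a spherical sector centered on that point, and an axis-aligned box around sampled reach points can stand in for the three-dimensional bounding region of claims 3, 11 and 19. All function names, depth and angle limits, and the pivot model itself are assumptions for illustration, not the claimed implementation; a hypothetical trigger test in the spirit of claims 6-7 is sketched alongside.

```python
import numpy as np

def reachable_points(entry_point, inward_axis, max_depth_mm=12.0,
                     max_pivot_deg=45.0, n_samples=2000, seed=0):
    """Sample points a straight rigid tool could reach when inserted
    through entry_point and pivoted about it (illustrative model only)."""
    rng = np.random.default_rng(seed)
    axis = np.asarray(inward_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    depths = rng.uniform(0.0, max_depth_mm, n_samples)  # insertion depth
    tilts = np.radians(rng.uniform(0.0, max_pivot_deg, n_samples))
    spins = rng.uniform(0.0, 2.0 * np.pi, n_samples)
    # Orthonormal frame (u, v, axis) around the inward direction.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(axis @ helper) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, helper)
    u /= np.linalg.norm(u)
    v = np.cross(axis, u)
    # Tool directions tilted away from the inward axis, scaled by depth.
    dirs = (np.outer(np.cos(tilts), axis)
            + np.outer(np.sin(tilts) * np.cos(spins), u)
            + np.outer(np.sin(tilts) * np.sin(spins), v))
    return np.asarray(entry_point, dtype=float) + dirs * depths[:, None]

def should_demarcate(tip_pos, tip_dir, entry_point,
                     max_distance_mm=5.0, max_angle_deg=10.0):
    """Hypothetical trigger per claims 6-7: True when the cutting tool's
    tip is within a predefined distance of the entry point and aimed at it."""
    to_entry = np.asarray(entry_point, dtype=float) - np.asarray(tip_pos, dtype=float)
    dist = np.linalg.norm(to_entry)
    if dist > max_distance_mm:
        return False
    d = np.asarray(tip_dir, dtype=float)
    cos_a = np.clip((d / np.linalg.norm(d)) @ (to_entry / dist), -1.0, 1.0)
    return np.degrees(np.arccos(cos_a)) <= max_angle_deg

# Axis-aligned box around the samples, one possible 3D bounding region:
pts = reachable_points([0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
bbox_min, bbox_max = pts.min(axis=0), pts.max(axis=0)
```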
25. A system, comprising:
a display; and
a processor, configured to:
during a surgical procedure on an eye in which a cutting tool is used to make an incision in the eye, identify a location of the incision by tracking the tool and the eye,
mark the incision in at least one image of the eye, and
display the image on the display.
26. The system according to claim 25, wherein the surgical procedure is a robotic surgical procedure in which a robotic arm manipulates the cutting tool.
27. The system according to claim 25, wherein the image includes an optical coherence tomography image.
28. The system according to any one of claims 25-27, wherein the incision is in a lens capsule of the eye.
29. The system according to claim 28, wherein the processor is further configured to: compute at least one measure of deviation between the incision and a planned incision, based on respective shapes and/or positions of the incision and the planned incision, and display an output indicating the measure of deviation.
30. The system according to any one of claims 25-27, wherein the incision is in a cornea of the eye.
31. The system according to claim 30, wherein the processor is further configured to mark, in the image, a point along the incision at which another tool is inserted into the eye.
32. The system according to claim 30, wherein the processor is further configured to: compare a shape of the incision to a canonical shape, and based on the comparing, display an output indicating a deviation of the shape from the canonical shape.
33. The system according to claim 32, wherein the canonical shape is that of a three-planar incision.
34. A method, comprising:
during a surgical procedure on an eye in which a cutting tool is used to make an incision in the eye, identifying a location of the incision, by a processor, by tracking the tool and the eye;
marking the incision, by the processor, in at least one image of the eye; and
displaying the image, by the processor, on a display.
35. The method according to claim 34, wherein the surgical procedure is a robotic surgical procedure in which a robotic arm manipulates the cutting tool.
36. The method according to claim 34, wherein the image includes an optical coherence tomography image.
37. The method according to any one of claims 34-36, wherein the incision is in a lens capsule of the eye.
38. The method according to claim 37, further comprising: computing at least one measure of deviation between the incision and a planned incision, based on respective shapes and/or positions of the incision and the planned incision; and displaying an output indicating the measure of deviation.
39. The method according to any one of claims 34-36, wherein the incision is in a cornea of the eye.
40. The method according to claim 39, further comprising marking, in the image, a point along the incision at which another tool is inserted into the eye.
41. The method according to claim 39, further comprising: comparing a shape of the incision to a canonical shape; and based on the comparing, displaying an output indicating a deviation of the shape from the canonical shape.
42. The method according to claim 41, wherein the canonical shape is that of a three-planar incision.
43. A computer software product comprising a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to:
during a surgical procedure on an eye in which a cutting tool is used to make an incision in the eye, identify a location of the incision by tracking the tool and the eye,
mark the incision in at least one image of the eye, and
display the image on a display.
44. The computer software product according to claim 43, wherein the surgical procedure is a robotic surgical procedure in which a robotic arm manipulates the cutting tool.
45. The computer software product according to claim 43, wherein the image includes an optical coherence tomography image.
46. The computer software product according to any one of claims 43-45, wherein the incision is in a lens capsule of the eye.
47. The computer software product according to claim 46, wherein the instructions further cause the processor to: compute at least one measure of deviation between the incision and a planned incision, based on respective shapes and/or positions of the incision and the planned incision, and display an output indicating the measure of deviation.
48. The computer software product according to any one of claims 43-45, wherein the incision is in a cornea of the eye.
49. The computer software product according to claim 48, wherein the instructions further cause the processor to mark, in the image, a point along the incision at which another tool is inserted into the eye.
50. The computer software product according to claim 48, wherein the instructions further cause the processor to: compare a shape of the incision to a canonical shape, and based on the comparing, display an output indicating a deviation of the shape from the canonical shape.
51. The computer software product according to claim 50, wherein the canonical shape is that of a three-planar incision.
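As one concrete reading of the "measure of deviation" in claims 29, 38 and 47, the achieved and planned incisions could each be sampled as point sets and compared with a symmetric Hausdorff distance; scoring a corneal incision against a canonical (e.g., three-planar) shape per claims 32-33 could reuse the same metric after registration. The metric and names below are illustrative assumptions, not the claimed computation:

```python
import numpy as np

def deviation_mm(incision_pts, planned_pts):
    """Symmetric Hausdorff distance between two sampled curves given as
    (N, 3) and (M, 3) point arrays in millimeters; one hypothetical
    'measure of deviation' between an incision and a planned incision."""
    a = np.asarray(incision_pts, dtype=float)
    b = np.asarray(planned_pts, dtype=float)
    # Pairwise distances between every achieved and every planned point.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return max(d.min(axis=1).max(), d.min(axis=0).max())
```

The resulting value could drive the displayed output directly, for example by flagging the incision when the deviation exceeds a chosen tolerance.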
52. A system, comprising:
a display; and
a processor, configured to:
during a surgical procedure on an eye, track a border of a pupil of the eye as the pupil dilates,
based on the tracking, augment an image of the eye with an indication of a size of the pupil, and
display the image on the display.
53. The system according to claim 52, wherein the surgical procedure is a robotic surgical procedure.
54. The system according to claim 52, wherein the processor is configured to augment the image by marking the border of the pupil.
55. The system according to any one of claims 52-54, wherein the processor is configured to augment the image by displaying the size in association with the image.
56. A method, comprising:
during a surgical procedure on an eye, tracking a border of a pupil of the eye, by a processor, as the pupil dilates;
based on the tracking, augmenting an image of the eye, by the processor, with an indication of a size of the pupil; and
displaying the image, by the processor, on a display.
57. The method according to claim 56, wherein the surgical procedure is a robotic surgical procedure.
58. The method according to claim 56, wherein augmenting the image comprises marking the border of the pupil.
59. The method according to any one of claims 56-58, wherein augmenting the image comprises displaying the size in association with the image.
60. A computer software product comprising a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to:
during a surgical procedure on an eye, track a border of a pupil of the eye as the pupil dilates,
based on the tracking, augment an image of the eye with an indication of a size of the pupil, and
display the image on a display.
61. The computer software product according to claim 60, wherein the surgical procedure is a robotic surgical procedure.
62. The computer software product according to claim 60, wherein the instructions cause the processor to augment the image by marking the border of the pupil.
63. The computer software product according to any one of claims 60-62, wherein the instructions cause the processor to augment the image by displaying the size in association with the image.
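Claims 52-63 track the pupil border during dilation and annotate the image with the pupil's size. Assuming the border has already been segmented into 2D image points (the segmentation and overlay rendering are not shown), one minimal way to derive the size indication is an algebraic least-squares circle fit:

```python
import numpy as np

def fit_pupil_circle(border_xy):
    """Kasa least-squares circle fit to segmented pupil-border pixels.
    border_xy -- (N, 2) array of (x, y) points on the tracked border.
    Returns (cx, cy, radius); 2 * radius is one possible 'indication of
    a size of the pupil' to display in association with the image."""
    pts = np.asarray(border_xy, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    # Fit x^2 + y^2 + a*x + b*y + c = 0 in the least-squares sense.
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(A, -(x**2 + y**2), rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    return cx, cy, float(np.sqrt(cx**2 + cy**2 - c))
```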
64. A system, comprising:
a display; and
a processor, configured to:
track a phacoemulsifier as the phacoemulsifier phacoemulsifies a lens of an eye,
based on the tracking, indicate, in an image of the eye, locations in the eye through which the phacoemulsifier passed, and
display the image on the display.
65. The system according to claim 64, wherein the phacoemulsifier is manipulated by a robotic arm.
66. The system according to claim 64, wherein the processor is configured to indicate the locations by demarcating a range of at least some of the locations in the image.
67. The system according to claim 66, wherein the processor is configured to demarcate the range with a three-dimensional bounding region.
68. The system according to claim 64, wherein the processor is configured to indicate the locations by displaying a curve that passes through at least some of the locations.
69. The system according to any one of claims 64-68, wherein the processor is further configured to: identify, by processing the image, one or more detached portions of the lens that were not yet aspirated by the phacoemulsifier, and mark the portions in the image.
70. A method, comprising:
tracking a phacoemulsifier, by a processor, as the phacoemulsifier phacoemulsifies a lens of an eye;
based on the tracking, indicating in an image of the eye, by the processor, locations in the eye through which the phacoemulsifier passed; and
displaying the image, by the processor, on a display.
71. The method according to claim 70, wherein the phacoemulsifier is manipulated by a robotic arm.
72. The method according to claim 70, wherein indicating the locations comprises indicating the locations by demarcating a range of at least some of the locations in the image.
73. The method according to claim 72, wherein demarcating the range comprises demarcating the range with a three-dimensional bounding region.
74. The method according to claim 70, wherein indicating the locations comprises indicating the locations by displaying a curve that passes through at least some of the locations.
75. The method according to claim 70, further comprising: identifying, by processing the image, one or more detached portions of the lens that were not yet aspirated by the phacoemulsifier; and marking the portions in the image.
76. A computer software product comprising a tangible non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to:
track a phacoemulsifier as the phacoemulsifier phacoemulsifies a lens of an eye,
based on the tracking, indicate, in an image of the eye, locations in the eye through which the phacoemulsifier passed, and
display the image on a display.
77. The computer software product according to claim 76, wherein the phacoemulsifier is manipulated by a robotic arm.
78. The computer software product according to claim 76, wherein the instructions cause the processor to indicate the locations by demarcating a range of at least some of the locations in the image.
79. The computer software product according to claim 78, wherein the instructions cause the processor to demarcate the range with a three-dimensional bounding region.
80. The computer software product according to claim 76, wherein the instructions cause the processor to indicate the locations by displaying a curve that passes through at least some of the locations.
81. The computer software product according to any one of claims 76-80, wherein the instructions further cause the processor to: identify, by processing the image, one or more detached portions of the lens that were not yet aspirated by the phacoemulsifier, and mark the portions in the image.
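Claims 64-81 amount to accumulating the tracked phacoemulsifier positions and summarizing them either as a three-dimensional bounding region or as a curve through the visited locations. A minimal bookkeeping sketch, with the class name, update interface and moving-average smoothing all assumed for illustration:

```python
import numpy as np

class PhacoPathRecorder:
    """Accumulates tracked tip positions and summarizes them as a 3D
    bounding box or a smoothed polyline (illustrative sketch only)."""

    def __init__(self):
        self.positions = []  # one 3-vector per tracking update

    def record(self, tip_position):
        self.positions.append(np.asarray(tip_position, dtype=float))

    def bounding_region(self):
        """Axis-aligned 3D box enclosing every visited location."""
        pts = np.vstack(self.positions)
        return pts.min(axis=0), pts.max(axis=0)

    def path_curve(self, window=5):
        """Moving-average polyline through the visited locations, one way
        to render 'a curve that passes through at least some of the
        locations' over the image."""
        pts = np.vstack(self.positions)
        if len(pts) < window:
            return pts
        kernel = np.ones(window) / window
        return np.column_stack(
            [np.convolve(pts[:, i], kernel, mode="valid") for i in range(3)])
```

Identifying detached lens fragments not yet aspirated (claims 69, 75 and 81) would instead require image segmentation, which is outside this sketch.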
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463639644P | 2024-04-28 | 2024-04-28 | |
| US63/639,644 | 2024-04-28 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025229483A1 (en) | 2025-11-06 |
Family
ID=95743764
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2025/054381 (published as WO2025229483A1, pending) | Augmented images for facilitating ophthalmic surgery | 2024-04-28 | 2025-04-28 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025229483A1 (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2017015738A1 (en) * | 2015-07-27 | 2017-02-02 | Synaptive Medical (Barbados) Inc. | Navigational feedback for intraoperative waypoint |
| WO2018167246A1 (en) * | 2017-03-15 | 2018-09-20 | Orthotaxy | System for guiding a surgical tool relative to a target axis in spine surgery |
| WO2020048511A1 (en) * | 2018-09-05 | 2020-03-12 | Point Robotics Medtech Inc. | Navigation system and method for medical operation |
| US20200330174A1 (en) * | 2019-04-17 | 2020-10-22 | Ethicon Llc | Robotic procedure trocar placement visualization |
| US20230099522A1 (en) * | 2021-09-28 | 2023-03-30 | Intuitive Surgical Operations, Inc. | Elongate device references for image-guided procedures |
| US20230240779A1 (en) | 2021-12-02 | 2023-08-03 | Forsight Robotics Ltd. | Force feedback for robotic microsurgical procedures |
Similar Documents
| Publication | Title |
|---|---|
| US20230157872A1 | Microsurgical robotic system for ophthalmic surgery |
| Charreyron et al. | A magnetically navigated microcannula for subretinal injections |
| US20230240779A1 | Force feedback for robotic microsurgical procedures |
| US20250017676A1 | Robotic unit for microsurgical procedures |
| Chen et al. | Semiautomated optical coherence tomography-guided robotic surgery for porcine lens removal |
| US20250288203A1 | Contactless tonometer and measurement techniques for use with surgical tools |
| US20240307132A1 | Virtual tools for microsurgical procedures |
| US20250228705A1 | Robotic capsulotomy |
| US20230240773A1 | One-sided robotic surgical procedure |
| WO2024201236A1 | Engagement of microsurgical robotic system |
| Yang et al. | Robot-assisted subretinal injection system: development and preliminary verification |
| WO2024231879A1 | Input-receiving component for robotic microsurgical procedures |
| WO2024176143A1 | Control component for robotic microsurgical procedures |
| WO2025229483A1 | Augmented images for facilitating ophthalmic surgery |
| CN118574585A | Force feedback for robotic microsurgery |
| US20250072986A1 | Orienting image for robotic surgery |
| WO2025196696A1 | Controlling a surgical tool for performing microsurgical procedures in a robotic manner |
| Hubschman et al. | Robotic surgery in ophthalmology |