SG Docket No.: 14843-705.600

ROBOTIC ASSISTED OPHTHALMIC SURGERY SYSTEM

CLAIM OF PRIORITY
[0001] This application claims priority to U.S. Provisional Patent Application No. 63/514,777, entitled "ROBOTIC ASSISTED OPHTHALMIC SURGERY SYSTEM", filed on July 20, 2023, the entire disclosure of which is incorporated herein by reference.

INCORPORATION BY REFERENCE
[0002] All publications and patent applications mentioned in this specification are herein incorporated by reference in their entirety to the same extent as if each individual publication or patent application was specifically and individually indicated to be incorporated by reference.

FIELD
[0003] This application relates to the field of robotic assisted ophthalmic surgical systems.

BACKGROUND
[0004] Cataract surgery involves a suite of procedural steps including (1) creating a corneal incision, (2) removing the anterior capsule via capsulorhexis, (3) fragmenting the cataract into pieces of lens material, (4) emulsifying and aspirating the lens material using an ultrasonic surgical instrument, (5) aspirating the remaining lens material with an irrigation-aspiration (I/A) tool, and (6) inserting an intraocular lens implant. Several efforts have investigated partially or fully automating these steps or developing robotic systems to improve surgical outcomes. However, significant challenges remain, and further development is needed to improve this procedure.

SUMMARY OF THE INVENTION
[0005] According to one example of the present invention, there is a computer-controlled ophthalmic robotic surgical system including: a multimodal visualization subsystem configured to acquire imaging data of an operative eye; a tool exchange system configured to hold a plurality of surgical tools; a plurality of robotic arms configured to receive and manipulate the plurality of surgical tools; an artificial intelligence (AI) subsystem configured to produce outputs using the imaging data from the visualization subsystem, the outputs including: acquired data of the operative eye, and patient-specific metrics extracted from the data for the operative eye, including anatomical landmarks and a visualization field including the operative eye; a trajectory planner configured to receive the outputs from the AI subsystem and determine a trajectory path for a selected one of the plurality of surgical tools; and a control system coupled to the visualization subsystem, AI subsystem, and tool exchange system, in which the control system is configured to receive and execute the determined trajectory path on the operative eye, in which the trajectory path is modified in real-time based on data from one or more of the visualization subsystem and AI subsystem.
[0006] According to one embodiment of this example, the multimodal visualization subsystem includes at least one optical coherence tomography (OCT) device input and a digital microscope input.

[0007] According to one embodiment of this example, the surgical system further includes a docking system configured to dock to the operative eye.

[0008] According to one embodiment of this example, the plurality of surgical tools in the tool exchange system is pre-configured for use with one or more of a specific ophthalmic surgery type, patient, or surgeon.

[0009] According to yet another example of the present invention, there is a method of real-time 3D automated monitoring of an operative eye, including: acquiring data of the operative eye from a multimodal visualization subsystem including at least one optical coherence tomography (OCT) device input and one digital microscope input; extracting, via an artificial intelligence (AI) subsystem, patient-specific parameters from the acquired data, in which the extracting includes: overlaying digital anatomical landmarks on the acquired data of the operative eye, and determining a surgical incision site on the operative eye; determining, via a trajectory planner, a trajectory path through the surgical incision site on the operative eye for a selected one of a plurality of surgical tools; and modifying the trajectory path in real-time via the trajectory planner based on re-acquiring the data of the operative eye from the multimodal visualization subsystem and re-extracting the patient-specific parameters from the acquired data.

[0010] According to one embodiment of the method, the OCT device scans to a depth of up to ~18mm in the operative eye.

[0011] According to one embodiment of the method, the visualization field of volume for the OCT device extends from a first depth to a second depth.

[0012] According to one embodiment of the method, the first depth is a top of a cornea of the operative eye, wherein the second depth is a posterior capsule of the operative eye.

[0013] According to one embodiment of the method, the trajectory planner reassesses the trajectory path via one or more of the digital microscope input and the OCT device input of the multimodal visualization subsystem.

[0014] According to one embodiment of the method, the trajectory planner reassesses the trajectory path at a rate of up to 20Hz.
[0015] According to one embodiment of the method, the multimodal visualization subsystem collects and supplies imaging data to the trajectory planner.

[0016] According to one embodiment of the method, the multimodal visualization subsystem collects and supplies imaging data to the trajectory planner at rates of up to 100Hz.

[0017] According to one embodiment of the method, the method further includes the trajectory planner reassessing the trajectory path via one or more of the digital microscope input and the OCT device input of the multimodal visualization subsystem and the multimodal visualization subsystem collecting and supplying imaging data to the trajectory planner at a same operating frequency.

[0018] According to one embodiment of the method, the method further includes the trajectory planner reassessing the trajectory path via one or more of the digital microscope input and the OCT device input of the multimodal visualization subsystem and the multimodal visualization subsystem collecting and supplying imaging data to the trajectory planner at a different operating frequency.

[0019] According to one embodiment of the method, patient-specific parameters include operative eye size, pupil shape, anatomical anomalies, and pathology.

[0020] According to one embodiment of the method, extracting the patient-specific parameters from the acquired data further includes imaging the selected one of the plurality of surgical tools.

[0021] According to one embodiment of the method, extracting the patient-specific parameters from the acquired data further includes determining a location of a tip of the selected one of the plurality of surgical tools in a visualization field of volume.

[0022] According to one embodiment of the method, the trajectory path incorporates further surgical procedures including phacoemulsification and polishing of the operative eye.

[0023] According to yet another example of the present invention, there is a method for automated control of an ophthalmic robotic surgical system, including: receiving, at a behavior tree layer of the surgical system, inputs from a human-robot interaction GUI for executing actions along steps of a determined trajectory path for a surgical instrument operating on an operative eye; determining a current step in the determined trajectory path; determining a next correct step in the determined trajectory path; determining a location of a tip of the surgical instrument relative to landmarks in a visualization field of volume using an OCT device of the surgical system; and executing, via a communication layer, commands to a physical layer of the surgical system to achieve the next correct step in the determined trajectory pathway.
[0024] According to one embodiment of this method, inputs from the human-robot interaction GUI include commands to pause, resume, cancel, start, or finish the current step in the determined trajectory pathway, wherein the current step is in one or more of a paused, running, or idle state.

[0025] According to one embodiment of this method, the physical layer includes a control PC engine actuating one or more of: a docking system, motion stage, probe of the OCT device, and hardware.

[0026] According to one embodiment of this method, steps of the determined trajectory pathway emulate standard procedures used in manual cataract surgery.

[0027] According to one embodiment of this method, the inputs include inputs from a user-surgeon when the surgical system is operating in a teleoperative mode.

[0028] According to one embodiment of this method, the inputs include inputs from a planning PC engine when the surgical system is operating in an assistive mode.
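To make the data flow of the real-time monitoring method summarized above concrete, the following is a minimal Python sketch of the acquire/extract/plan/replan loop, with imaging data supplied at up to 100Hz and the trajectory path reassessed at up to 20Hz. All interface names (acquire_frame, extract_parameters, plan, replan, execute_step) are hypothetical placeholders for illustration, not part of the disclosed system.

```python
# Illustrative sketch only: the visualization, AI, planner, and controller
# objects and their methods are assumed interfaces, not the disclosed API.
import time

IMAGING_HZ = 100.0  # visualization subsystem may supply data at up to 100Hz
PLANNER_HZ = 20.0   # trajectory planner may reassess the path at up to 20Hz

def monitoring_loop(visualization, ai_subsystem, planner, controller):
    trajectory = None
    last_replan = 0.0
    while controller.active():
        # Re-acquire data of the operative eye (OCT + digital microscope inputs).
        frame = visualization.acquire_frame()
        # Re-extract patient-specific parameters (landmarks, incision site, tool tip).
        params = ai_subsystem.extract_parameters(frame)
        now = time.monotonic()
        if trajectory is None:
            trajectory = planner.plan(params)
        elif now - last_replan >= 1.0 / PLANNER_HZ:
            # Modify the trajectory path in real-time from the fresh parameters.
            trajectory = planner.replan(trajectory, params)
            last_replan = now
        controller.execute_step(trajectory)
        time.sleep(1.0 / IMAGING_HZ)
```

Decoupling the two rates in this way reflects the embodiments above in which the planner and the visualization subsystem may operate at either the same or different frequencies.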
BRIEF DESCRIPTION OF THE DRAWINGS
[0029] A better understanding of the features and advantages of the methods and apparatuses described herein will be obtained by reference to the following detailed description that sets forth illustrative embodiments, and the accompanying drawings of which:

[0030] FIG.1 illustrates an exemplary robotic surgical system.
[0031] FIG.2 illustrates an exemplary structure and layout of a surgeon-side cockpit of the robotic surgical system.
[0032] FIG.3 illustrates exemplary modules for operation of the robotic surgical system during ophthalmic procedures.
[0033] FIG.4A illustrates an exemplary control flow of the robotic surgical system in assistive mode.
[0034] FIG.4B illustrates an exemplary control flow of the robotic surgical system in teleoperative mode.
[0035] FIG.4C illustrates exemplary high-level states of a control flow for the robotic surgical system in assistive mode.
[0036] FIG.5 illustrates an exemplary process flow for execution of the robotic surgical system in teleoperative mode.
[0037] FIG.6 illustrates exemplary robotic manipulators and a visualization system of the robotic surgical system.
[0038] FIG.7 illustrates an exemplary software architecture for the robotic surgical system including a behavior tree layer.
[0039] FIG.8 illustrates exemplary interfaces between the robotic surgical system and a phacoemulsification system.
[0040] FIG.9 illustrates a flow chart showing an exemplary method of real-time 3D automated monitoring of an operative eye.
[0041] FIG.10 illustrates a flow chart showing an exemplary method for automated control of an ophthalmic robotic surgical system.

DETAILED DESCRIPTION
[0042] Described herein are systems and methods for robotic assisted ophthalmic surgery. One example of a robotic surgical system for ophthalmic surgery is Polaris™ or Polaris System™, an exemplary robot-assisted ophthalmic surgical platform and computer-controlled robotic microsurgical system for cataract surgery, developed by Horizon Surgical Systems, Inc., of Malibu, California. Such an exemplary robotic surgical system may consist of two major integrated subsystems: (1) the patient-side Surgical Cart (FIG.1), which comprises a multimodal visualization (stereo digital microscopy and Optical Coherence Tomography (OCT)) system and robotic arms that include two articulated high-precision micromanipulators ("Micromanipulators") installed on macromanipulator arms ("Macro Robotic Arms" or "Macro Arms"), which together control the movement of off-the-shelf surgical instruments, and (2) the surgeon-side Cockpit, where the surgeon remotely controls the Micromanipulators via a graphical user interface or through direct teleoperation using the real-time output from the multimodal visualization system. Integrated into the Surgical Cart is a patient interface system, which constrains the eye relative to the visualization system and Micromanipulators.

[0043] The robotic surgical system operates in either an assistive or a teleoperation mode. In both operating modes, the visualization system provides a high quality 3D volume scan of the operative eye and a 3D stereoscopic view of the surgical target using the digital microscope, which the surgeon uses to define a safe area for surgical instrument operation. Throughout the surgery, this area of operation is updated to reflect patient motion. In assistive mode, each surgical step is precomputed by the robotic surgical system using the multimodal visualization information. Before the execution of each step, the precomputed instrument paths are presented to the surgeon for modification and approval. All precomputed motions are supervised by the surgeon during execution, and the surgeon can intervene by pausing the execution at any time. In teleoperation mode, the surgical steps are performed by the surgeon using the surgeon-side Cockpit and a set of telemanipulators (physical joystick controls). In both modes of operation, the surgeon is in full control of the robotic surgical system and can switch between the two modes interchangeably. The robotic surgical system allows the surgeon the ability to safely cancel the assistive mode operation at any time. After canceling, the surgeon can either switch to teleoperation mode or take manual control of the surgery.
Operating Room Workflow
[0044] The operating room (OR) workflow can be divided into three steps: (1) Preoperative, (2) Intraoperative, and (3) Postoperative. System initialization tasks (Table 1) are performed once per day, upon startup of the robotic surgical system. After the power-on sequence, prior to each new patient, the system tests and calibrates the robotic and visualization components against known factory settings. The workflow of the operating room when the robotic surgical system is being used is similar to the current standard of care for cataract surgery using femtosecond lasers and phacoemulsification devices. The surgeon is seated at the surgical console for the duration of the surgery, and the system only operates when the surgeon allows it to. A nurse is present at the patient-side surgical Graphical User Interface (GUI) to monitor the patient and the immediate surroundings. Tables 2 and 3, below, provide a more detailed comparison of the standard of care cataract surgical procedure and the robotic surgical system cataract surgical procedure.

Table 1: System Initialization (to be executed once a day or upon power-on of the system)
Description of Surgical Step | Notes
System initialization | The details of initialization may differ between systems, but the step of initialization is the …

Table 2: Pre-operative workflow of the Polaris System™
Description of Surgical Step | Notes
All tools prepared, installed in, and loaded into the magazine | Nurse installs tool magazine onto Surgical …
… over the operative eye | …
Align focus of the Visualization System to the operative eye | Visualization system is aligned such that the full depth of the operative eye is in focus
… initial tools engaged | Surgeon/Nurse requests a specific tool via the surgical GUI, and can confirm the correct tool is … Tool exchange is performed by the Automated Instrument Exchange System

Table 3: Postoperative workflow of the Polaris System™
Step Description | Notes
Remove docking … | …
Move surgical arms to parking position | … the instruction of the surgeon via the surgical graphical user interface
Move out patient | Nurse performs this …
Remove tool magazine | … to remove the tool magazine
Operating Modes

Assistive Mode
[0045] In assistive mode, each surgical step is precomputed by the robotic surgical system using the multimodal visualization information. Before the execution of each step, the precomputed instrument paths are presented to the surgeon for modification and approval. All precomputed instrument paths are supervised by the surgeon during execution, and the surgeon can intervene by pausing the execution at any time. The following sequence is followed for each surgical step:
1. The system acquires visualization data, processes the data to extract required information, and uses it to generate a proposed surgical path with a safe area for surgical instrument operation.
2. The safe area for surgical instrument operation, as generated by the 3D eye reconstruction, is reviewed, modified if necessary, and approved by the surgeon.
3. The surgeon evaluates the recommended surgical path through the surgical GUI and modifies it, as needed.
4. When the Phacoemulsification System is used, the surgeon evaluates the recommended phacoemulsification settings and makes modifications, as needed.
5. Upon surgeon approval, the planned surgical path is sent to the robotic subsystem, which begins its execution.
6. During execution, the surgeon monitors the process via system-provided real-time visual and sensory feedback.
7. If desired by the surgeon, the surgeon can modify the surgical path, surgical instrument settings, and/or execution in real-time to update the system's performance of the surgical step.
8. If desired by the surgeon, the surgeon can interrupt the system, which safely relinquishes control to the surgeon, who then completes the surgical step via teleoperation mode.
9. If desired by the surgeon, or required by the system, the system can immediately retract the surgical instruments.
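Read as a control loop, the nine-step sequence above alternates surgeon approval with supervised execution. The following Python sketch shows one possible structure for a single assistive-mode step; vision, planner, robot, and surgeon_ui are hypothetical interfaces used purely for illustration, not the disclosed implementation.

```python
# Illustrative sketch of one assistive-mode surgical step; all interfaces
# are assumed placeholders, not the disclosed implementation.
def run_assistive_step(vision, planner, robot, surgeon_ui):
    data = vision.acquire()                             # step 1: acquire/process data
    safe_area, path = planner.propose(data)             # step 1: proposed path + safe area
    safe_area = surgeon_ui.review_safe_area(safe_area)  # step 2: surgeon reviews/modifies
    path = surgeon_ui.review_path(path)                 # steps 3-4: surgeon evaluates path
    if not surgeon_ui.approve(path):                    # step 5: approval gates execution
        return
    robot.start(path, safe_area)
    while not robot.done():                             # step 6: supervised execution
        if surgeon_ui.retract_requested() or robot.fault():
            robot.retract_instruments()                 # step 9: immediate retraction
            return
        if surgeon_ui.interrupt_requested():            # step 8: hand over control;
            robot.pause()                               # surgeon finishes via teleoperation
            return
        update = surgeon_ui.poll_modifications()        # step 7: real-time modification
        if update:
            robot.update(update)
```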
Teleoperation Mode
[0046] Teleoperation mode provides similar advantages to existing robot-assisted teleoperated surgical systems. In teleoperation mode, the surgical steps are performed by the surgeon using the surgeon-side Cockpit and a set of telemanipulators (physical joystick controls). Additional features specific to the Polaris System™ include:
- The surgeon may request a new surgical instrument at any point during the surgery, using the surgical instrument exchange system.
- Functionalities specific to cataract surgical instruments, such as fluid irrigation and phacoemulsification ultrasound power, are executed via foot pedals located at the surgeon-side Cockpit.
- The surgeon is provided with visual feedback from the surgical system.
- Visual cues are superimposed onto a real-time digital microscope video on the surgical GUI.
- The safe area for surgical instrument operation, as generated by the 3D eye reconstruction, is reviewed, modified if necessary, and approved by the surgeon.
- Surgical instrument movement is constrained to stay within a safe workspace in the eye.
- If desired by the surgeon, or required by the system, the system can immediately retract the surgical instruments.

[0047] In both modes of operation, the surgeon is in full control of the robotic surgical system and can switch between the two modes interchangeably. The robotic surgical system allows the surgeon the ability to safely cancel the assistive mode operation at any time to switch to teleoperation mode and take manual control of the surgery.

System Diagram: Patient-side
[0048] The patient side of the robotic surgical system consists of the eye stabilization subsystem (e.g., described with respect to the patient interface: eye stabilization subsystem), the visualization subsystem (e.g., described in more detail with respect to the visualization system – digital microscope, visualization system – OCT, and visualization system – motion platform sections), and the robotic subsystem (e.g., described in more detail with respect to the micromanipulators and macromanipulator arms sections). FIG.1 provides a diagram of the major components of the patient side of an embodiment of the robotic surgical system.

System Diagram: Surgeon-Side
[0049] In certain examples, the surgeon-side cockpit may be the command center for the robotic surgical system and provide the primary control interface for the surgeon during the operation. Features of the surgical robotic system, such as real-time intraoperative control of the robot (in teleoperation mode), supervision and evaluation of the precomputed surgical paths (in assistive mode), and system monitoring, are controlled by the surgeon via the surgeon-side cockpit. FIG.2 depicts a high-level diagram of an embodiment of the surgeon-side Cockpit and the information flow between the surgeon-side Cockpit and the patient-side Surgical Cart.
Sterility Draping Procedure
[0050] The draping procedure has three general steps:
1. Draping of the visualization system
2. Draping of the robotic arms
3. Draping of the patient

[0051] For draping of the visualization system, the nurse applies custom-fit surgical drapes that have two exposed cut-outs: (1) for the Common Main Objective (CMO) lens of the visualization system, and (2) for the mechanical interface that allows the eye stabilization system to mount directly to the visualization device. Following draping, the visualization system is positioned above the patient and adjusted by the surgeon.

[0052] Draping the robotic arms consists of two separate surgical drapes: (1) a drape for each robotic arm, which covers both Macro and Micromanipulators, and (2) a drape for the surgical instrument exchange system. For draping the robotic arms, the nurse commands the robotic system to a predefined posture such that drapes can be placed around each Macro Arm.

[0053] Draping of the patient proceeds according to the standard operating room procedure. The patient is positioned supine on the operating bed with their head stabilized by a headrest and surgical tape. The surgical team cleans the patient's eyelids and the facial areas around the eye with antiseptic, a sterile drape is placed over the patient's face and head, and the drape is secured in position.

Disposable / Sterilizable Components
The robotic surgical system uses both sterilizable and sterile, single-use components. All surgical instruments, as well as the instrument exchange magazine, are packaged in sterile packing and are only unpacked by the sterile nurse after the sterile field has been established by draping of the robotic surgical system.

Subsystem Descriptions

Patient-side Surgical Cart

Surgical Cart Overview
[0054] The Surgical Cart is the patient-facing subsystem for the robotic surgical system. It houses the hardware components relevant to the surgical procedure. It interfaces with the surgeon-side Cockpit to collect and exchange data to be presented to the supervising surgeon, as well as enable control by the surgeon.

Visualization system – OCT
[0055] The visualization system consists of two separate imaging modalities, a swept source OCT system and a digital microscope, that are moved via a motion platform. Images from both imaging modalities are displayed on the screen in the surgeon-side cockpit for the surgeon to approve pre-defined motions of the surgical instruments (in assistive mode), to move the robotic arms (teleoperative mode), or for the surgeon to stop and retract the surgical instruments (both modes).
[0056] The purpose of the OCT system may be to provide real-time, cross-sectional and depth-resolved images of the subject eye, enabling localization of surgical tools and anatomical structures in all three dimensions.

[0057] The OCT system may consist of three modules: the OCT engine, the OCT scanner, and the acquisition and display software (e.g., described in more detail with respect to visualization software).

[0058] The OCT engine may be based on a MEMS-VCSEL SSOCT laser. The laser output may be centered at approximately 1060nm and sweep at 200kHz over a 65nm bandwidth. The laser may be coupled to a fiber Mach-Zehnder interferometer, which delivers light to the OCT scanner, described below. Light returning from the scanner may be mixed with light from a local oscillator and detected by a high bandwidth balanced photoreceiver. Output from the receiver may be sampled using a high speed digitizer, allowing the system to produce A-scans at 200kHz, with an axial resolution of 9 microns (in tissue), an imaging depth of 18mm, and a sensitivity of >105dB.

[0059] The OCT scanner may receive light from the Mach-Zehnder interferometer and direct it towards the eye. It interfaces with the digital microscope (described below) via a dichroic mirror. The scanner employs two offset galvanometers and a custom optical relay to scan an image of the fiber tip across the subject eye and collect backscattered light. Dynamic focus control is provided by a liquid lens positioned after the collimator and before the galvanometers. The dichroic mirror is positioned immediately above the common main objective lens, which is shared between the digital microscope and OCT system. The OCT scanner may produce images with a lateral resolution of 26 microns over a lateral FOV of 28mm. Dynamic focus control allows the focus to be shifted across the entire 18mm imaging depth (and beyond).

Visualization system - Digital Microscope
[0060] The purpose of the digital microscope in certain examples may be to provide real-time, 3D color visualization of the surgical field; oblique illumination of both the sclera and anterior chamber; and red reflex illumination that delivers light to the retina to back-illuminate the crystalline lens and capsule.

[0061] The digital microscope may also consist of three modules: the illumination system, the optical train, and the acquisition and display software (e.g., described in more detail with respect to visualization software).
[0062] The illumination system consists of four oblique illuminators and two red reflex illuminators. All four oblique illuminators are based on LEDs coupled to multi-mode fibers. The oblique illuminators are positioned around the objective lens. They are configured as "fly's eye" illuminators, which produce highly uniform illumination and minimize tool shadowing and other directional effects to ensure consistent illumination across the surgical field.

[0063] Red reflex illumination is provided by an integrated, dual beam system, with one beam co-axial with each camera channel. The beam size at the pupil plane is continuously variable between 6mm and 12mm via an optical zoom system. In addition, the angle of attack of the red reflex illumination relative to the vantage of each camera is adjustable. While this angle is nominally set to 0 degrees to the corresponding channel, it can be increased to up to 2 degrees (in the superior-inferior direction) to provide improved contrast. The red reflex illumination uses relatively narrow red-orange illumination from an LED, with a spectral output concentrated between 550nm and 700nm.

[0064] The microscope's optical train consists of several custom designed lenses, as well as dichroic mirrors and beam splitters for folding in the OCT and red reflex systems, respectively. The microscope employs a standard common main objective (CMO) arrangement, with a 175mm working distance.

Visualization system - Motion Platform
[0065] The purpose of the motion platform is to adjust the position of the visualization system relative to the operative eye. The motion platform enables operation on either eye using one stationary cart position.

[0066] The motion platform is an electromechanical system consisting of motorized multi-axis linear stages. The motion platform can be divided into two sections:
1. A large linear stage system, which moves both the Macro Arms and visualization system relative to the patient
2. A smaller linear stage, which moves the visualization system with respect to the Macro Arms.

Micromanipulators
[0067] The Micromanipulators perform the fine motions involved in cataract surgery. Each Micromanipulator consists of a multiple degree-of-freedom actuated robotic serial manipulator. These manipulators act as the equivalent of the surgeon's hands in performing the surgical procedure steps. They provide all motor torques required to manipulate the surgical instrument within the eye.
[0068] The Micromanipulator is based on the principle of a remote-center-of-motion (RCM). The Micromanipulators move surgical instruments, held in the Universal Surgical Instrument Adapter (e.g., described in more detail with respect to the Universal Surgical Instrument Adapter section), such that their tip is always pointing directly towards or away from a fixed point in space (the RCM). As such, the Micromanipulator is considered a spherical mechanism. Each robot joint either rotates the surgical instrument around the RCM or moves the instrument directly towards or away from the RCM. All joints cause rotation about, or translation directly towards/away from, a fixed point in space. The Micromanipulator cannot move the RCM. For all surgical steps requiring intraocular instrument motions, the Macromanipulator Arms, via the Visualization System, align the RCM of the Micromanipulator with the surgical incision. The Micromanipulator also contains actuators to enable actuation of specific surgical instruments (e.g., injection or aspiration), when needed.

[0069] The base of the Micromanipulator is rigidly connected to the distal end of the Macromanipulator Arms. The opposite end (distal to the Macro Arm) of the Micromanipulators connects to the Universal Surgical Instrument Adapter, which holds the surgical instrument. The entirety of the Micromanipulator is located behind the sterile field and does not make direct contact with the surgical instruments or the patient. One or two Micromanipulators are used for each step of the surgery.

[0070] CAD models of prototype Micromanipulators are shown in FIG.6, which shows two micromanipulators 604 positioned to perform a surgical substep. The micromanipulators may be attached to a Universal Surgical Instrument Adapter, which may hold a surgical instrument.
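The RCM constraint can be illustrated geometrically: every pose reachable by the spherical mechanism keeps the instrument axis passing through one fixed point. The sketch below is one possible parameterization for illustration only (two pivot angles about the RCM plus an insertion depth along the tool axis); it is not the disclosed mechanism design.

```python
# Illustrative parameterization of a remote-center-of-motion constraint.
import numpy as np

def tool_tip_position(rcm, pitch, yaw, insertion):
    """Tip position for a tool pivoting about a fixed RCM point.

    pitch, yaw: rotations (rad) of the tool axis about the RCM.
    insertion: signed distance (m) of the tip along the tool axis.
    """
    # Unit tool axis: the reference axis (-z, pointing into the eye)
    # rotated by pitch and yaw. Any pitch/yaw keeps it a unit vector.
    axis = np.array([
        np.cos(pitch) * np.sin(yaw),
        -np.sin(pitch),
        -np.cos(pitch) * np.cos(yaw),
    ])
    # The axis always passes through the RCM, so pivoting reorients the
    # instrument without ever moving the entry point itself.
    return rcm + insertion * axis

# Example: with the RCM aligned to the corneal incision, changing pitch/yaw
# redirects the tool inside the eye while the incision point stays fixed.
tip = tool_tip_position(np.zeros(3), pitch=0.10, yaw=0.05, insertion=0.004)
```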
Macromanipulator Arms
[0071] The purpose of the Macromanipulator Arms (or "Macro Arms") may include holding and moving the Micromanipulators. The Macro Arms translate and orient the base of the Micromanipulators such that they are at the desired locations for each surgical step. The Macro Arms are held fixed while the surgical instruments are operating within the eye. The Macro Arm also moves the Micromanipulator to perform a surgical instrument exchange, when required. The Macro Arms are moved to a pre-configured position to allow for convenient draping of the system prior to surgery. The Macro Arms collapse to have a smaller footprint when not in use.

Subsystem Description
[0072] The Macro Arms refer to two multi degree-of-freedom (DOF) robotic arms, each terminating with a Micromanipulator (e.g., described in more detail herein in the micromanipulator section). The Macro Arms are mounted on a cantilever beam that is positioned with the motion platform (e.g., described in more detail with respect to the visualization system – motion platform). The bases of the Macro Arms are positioned between the Surgical Instrument Exchange System and the visualization system. At the distal end of each Macro Arm, there may be a mechanical interface that rigidly couples with a Micromanipulator.

[0073] Base Positioning: The Macro Arms position the base of the Micromanipulators such that both the left and right eye of the patient can be operated on from a single Surgical Cart position.

[0074] Surgical Instrument Exchange: During relevant steps of the operation, the Macro Arms position the Micromanipulators at the Surgical Instrument Exchange System to perform a surgical instrument exchange.

[0075] Draping: The Macro Arms are positioned such that draping of the system is easily performed.

[0076] Stowing: The Macro Arms pose in a specific orientation such that the footprint of the system is minimized.

Universal Surgical Instrument Adapter
[0077] The Universal Surgical Instrument Adapter is a multi-component subsystem that enables the Polaris System™ to switch between the variety of surgical instruments required for cataract surgery. Different components of the subsystem transmit torque from the Micromanipulator (outside the sterile field) to surgical instruments (within the sterile field) for rotation and instrument-specific actuation, as well as holding and aligning the surgical instruments.

Subsystem Description
[0078] The Universal Surgical Instrument Adapter may be divided into three components: the sterile plate, universal holder, and instrument sleeve.

[0079] The sterile plate is a component of the sterile barrier that attaches to the distal end of the Micromanipulator. It is responsible for transmitting torque from the Micromanipulator to the universal holder without physical contact between those two components. It features a series of kinematic and magnetic features that enable quick and precise attachment and detachment of the universal holder.

[0080] Each surgical instrument can either be held directly by a universal holder or with an intermediate instrument sleeve that fits into the universal holder. For surgical instruments where an instrument sleeve is used, a custom sleeve adapts the geometry of the surgical instrument to that required by the universal holders. The instrument sleeve-surgical instrument connection is rigid.
[0081] The universal holder is a component within the sterile field. Each surgical instrument used in the surgery is held in an instrument sleeve in a universal holder. The universal holder is initially connected to the Surgical Instrument Exchange System (described with respect to the Surgical Instrument Exchange System section). Combined motions of the Micromanipulator and Macro Arms enable retrieving/returning individual universal holders from/to the surgical instrument exchange system as needed for use in the surgery. The universal holder receives the torque transmitted by the sterile plate and uses it to achieve functionality specific to a surgical instrument, such as rotation or injection. The universal holder contains a quick connect/disconnect interface to grasp and align a surgical instrument or instrument sleeve.

Surgical Instrument Exchange System
[0082] The Surgical Instrument Exchange System provides surgical instruments to the Micromanipulators and holds surgical instruments in standby when not in use. The Surgical Instrument Exchange System is the mechanism by which the robotic system can switch between different surgical instruments during operation.

Subsystem Description
[0083] The Surgical Instrument Exchange System is an electromechanical system with an associated control framework that manages the exchange of surgical instruments during surgery. The surgical instruments are requested via the surgical graphical user interface (GUI).

[0084] The Surgical Instrument Exchange System holds all surgical instruments that are intended to be used during surgery and is separated into two components: the Rotation Platform and the Rotary Instrument Carousel. The Rotation Platform is statically mounted to the Surgical Cart and houses the electronics and motors of the system. Prior to surgery, the sterile nurse installs the Rotary Carousel, holding all required surgical instruments, onto the Rotation Platform. Each surgical instrument can be detached from and reattached to the Carousel during the course of a surgery.

[0085] Rotation of the Rotary Carousel is performed by a motor embedded in the Rotation Platform, and the angular position is measured by encoders. Each surgical instrument assembly is tagged such that the system tracks the identity of each instrument on the Carousel. When a new surgical instrument is requested via the surgical GUI on the surgeon-side Cockpit, the Surgical Instrument Exchange System rotates to present the requested surgical instrument in an easily accessible location to the robotic arms. Upon the completion of a surgical step, or as requested by the surgeon, the robotic system replaces the surgical instrument into an empty slot on the Rotary Carousel.

Patient Interface: Eye Stabilization Device
[0086] The purpose of the eye stabilization device is to constrain the patient's eye within the field of the visualization system during intraocular surgery. The eye stabilization device may provide the following functionality to the robotic surgical system:
- Resist patient eye movement
- Resist forces due to motion of the surgical instrument
- Hydrate the eye at known intervals
[0087] A significant difference between the eye stabilization device of the robotic surgical system and marketed patient interfaces is that the eye stabilization system provides all required functionality while simultaneously allowing the two surgical instruments to access the required incision points during surgery.

Subsystem Description
[0088] The eye stabilization device consists of two separate components: the disposable docking ring and the non-disposable docking arm.

[0089] The docking ring makes direct contact with the patient's eye and provides suction and hydration via two external hydraulic lines. The docking ring is quickly connected and disconnected from the docking arm via a magnetic and kinematic coupling. The docking ring comes in a number of sizes to accommodate different eye shapes and sizes.

[0090] The docking arm provides a mechanical connection between the docking ring and the visualization system. It is designed such that the docking ring (and therefore the patient's operative eye) is concentric with and confocal to the visualization system. The docking arm can be rotated to accommodate operations on both left and right eyes.

[0091] The docking arm allows for small motions of the docking ring along the eye's central (vertical) axis, to (a) accommodate small motions of the patient's eye without applying undue pressure on the cornea, and (b) allow adjustment of the visualization system's focal plane. In the case that large patient motion is detected, the docking arm includes an emergency retraction mechanism that safely retracts the eye stabilization device.

Patient-side Computer Hardware
[0092] The purpose of the patient-side computer hardware is to host software and manage its communication with the corresponding robotic hardware.

Subsystem Description
[0100] The patient-side computer hardware may include:
- Low-level control computer hardware
  - One (1) PC hosts the operating system, software, and interfaces for processing and communication with Macro Arms and Micromanipulators
  - One (1) controller cabinet for each Macro Arm
  - One (1) controller cabinet for each Micromanipulator
- Central command computer hardware
  - One (1) PC hosts the operating system, software, and interfaces for processing and communication with all other subsystems
- Visualization computer hardware
  - One (1) PC hosts the operating system, software, and interfaces for processing and communication with the OCT/DM subsystem and Trajectory Planning subsystem
  - One (1) controller for laser
  - One (1) controller for camera
- Phaco machine host PC
  - One (1) PC hosts the operating system, software, and interfaces for processing and communication with the phacoemulsification system
Peripheral Sensors - Patient Motion Sensor
[0101] The robotic surgical system may incorporate an inertial sensor for monitoring patient motion during surgery. This sensor will provide a redundant measure of patient motion in addition to monitoring the eye through the visualization system.

[0102] The sensor used to monitor patient motion is an inertial sensor, which is a combination of an accelerometer, gyroscope, and magnetometer. This sensor will be mounted to the patient's head prior to the onset of the surgery. The primary use of the patient monitoring sensor is to trigger an emergency retract. In scenarios where the patient suddenly moves their head, the inertial sensor will trigger an emergency condition in the Central Command PC such that all surgical tools and the patient interface will immediately retract, and the robotic surgical system will assume a safe configuration.
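One simple realization of such a trigger is a threshold test on the inertial measurements. The sketch below is illustrative only; the threshold values, the imu and central_command interfaces, and the gravity-compensated readings are all assumptions, not disclosed parameters.

```python
# Illustrative emergency-retract trigger; all values and interfaces assumed.
import numpy as np

ACCEL_LIMIT = 2.0  # m/s^2, hypothetical limit on head acceleration
GYRO_LIMIT = 0.5   # rad/s, hypothetical limit on head angular rate

def monitor_patient_motion(imu, central_command):
    """Redundant patient-motion monitor that trips an emergency retract."""
    while central_command.surgery_active():
        # Gravity-compensated linear acceleration and angular rate of the head.
        accel, gyro = imu.read()
        if np.linalg.norm(accel) > ACCEL_LIMIT or np.linalg.norm(gyro) > GYRO_LIMIT:
            # Retract all surgical tools and the patient interface, then
            # place the system in a safe configuration.
            central_command.emergency_retract()
            break
```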
Surgeon-side Cockpit

Surgeon-side Cockpit overview
[0103] The surgeon-side Cockpit is the command center for the robotic surgical system. It provides the primary control interface to the surgeon during the operation. In addition to visualizing and validating the safe area of operation, every feature of the surgical robotic system, such as real-time intraoperative control of the robot (in teleoperation mode), supervision and evaluation of the precomputed surgical paths (in assistive mode), and system monitoring, may be controlled by the surgeon via the surgeon-side Cockpit. A high-level overview diagram of the surgeon-side Cockpit and the information flow with the patient-side Surgical Cart is depicted in FIG.2. As discussed in more detail above with respect to the operating modes, the robotic surgical system may be used in two Operating Modes. Each mode is operated by the surgeon through the Cockpit using the methods below:

Operating Mode 1: Assistive
[0104] In assistive mode, the surgeon primarily interacts with the patient-side Surgical Cart via the touchscreen surgical GUI (e.g., described in more detail with respect to surgeon display – 3D monitor). Telemanipulators (joystick controls) are used in circumstances in which the surgeon decides to take control of the system, at which time the robot-assisted operations immediately cease. This hand-over of operational control is described in detail with respect to the multi-modal human robot interaction section.

Operating Mode 2: Teleoperation
[0105] In teleoperation mode, the surgeon directly controls all surgical operations:
- Surgical instrument position is controlled via the telemanipulators (e.g., described in more detail with respect to the input device: telemanipulator section)
- Functions specific to a surgical instrument (phacoemulsification parameters, IOL injection, OVD injection, and irrigation/aspiration/ultrasound) are controlled via the Input Device Foot Pedal (e.g., described in more detail with respect to the input device: foot pedal section)
- The focal plane of the visualization system is adjusted via the Input Device Foot Pedal (e.g., described in more detail with respect to the input device: foot pedal section)
- Surgical instrument exchange requests are submitted by the surgeon via the surgical GUI (e.g., described in more detail with respect to the surgeon display: 3D monitor section)

Input Device: Telemanipulator
[0106] In teleoperation mode, the purpose of the surgeon-side Cockpit is to afford the surgeon direct control of the motion of the surgical instrument tip through the use of telemanipulators that capture the surgeon's hand motion. Regardless of the operating mode, the surgeon can always take control of the surgical system by using the telemanipulators.

Subsystem Description:
[0107] When the Polaris System™ is being operated in teleoperation mode, the surgeon will utilize two telemanipulators (joystick controls) for controlling two surgical instruments from the surgeon-side Cockpit. The input motions from the telemanipulator are gathered from the telemanipulator's joint encoders, processed by the Cockpit PC (e.g., described in more detail with respect to the cockpit computer hardware section), and transmitted to the surgical system for real-time control of the surgical instrument.

[0108] The robotic surgical system may have an input device that includes two 7 degree-of-freedom (DOF) telemanipulators, each of which terminates with a gripper that is used for actuating surgical instruments and interacting with the surgical GUI.

[0109] The raw signals from the telemanipulator are passed through a series of software algorithm steps in order to control the surgical robotic system:

[0110] Teleoperation: The 7DOF telemanipulator data is transformed into the 4DOF space such that the surgeon has a sense that they are holding the surgical instrument and can intuitively control the surgical robotic system by looking at the 3D monitor.

[0111] Input motion scaling: The surgical robotic system only moves a fraction of the distance moved by the telemanipulator gripper.

[0112] Tremor reduction: The surgeon's input motions, as measured by encoders within the telemanipulator, are sent through an algorithm to remove tremor.
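The scaling and tremor-reduction stages can be sketched as a small per-sample pipeline. The exponential low-pass filter below stands in for whatever tremor-suppression algorithm is actually used, and the scale and filter coefficients are illustrative assumptions.

```python
# Illustrative motion-scaling and tremor-reduction pipeline; the filter
# choice and all parameter values are assumptions for illustration.
import numpy as np

class HandMotionPipeline:
    def __init__(self, start_pos, scale=0.2, alpha=0.1):
        self.scale = scale             # instrument moves this fraction of hand motion
        self.alpha = alpha             # low-pass coefficient for tremor suppression
        self.filtered = np.asarray(start_pos, dtype=float)
        self.prev_filtered = self.filtered.copy()
        self.instrument = np.zeros(3)  # commanded instrument tip position

    def update(self, gripper_pos):
        """Consume one gripper position sample; return the instrument command."""
        gripper_pos = np.asarray(gripper_pos, dtype=float)
        # Tremor reduction: exponential low-pass on the measured gripper position.
        self.filtered = self.filtered + self.alpha * (gripper_pos - self.filtered)
        # Input motion scaling: command only a fraction of the filtered displacement.
        self.instrument += self.scale * (self.filtered - self.prev_filtered)
        self.prev_filtered = self.filtered.copy()
        return self.instrument
```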
Input Device: Foot Pedals
[0113] Purpose: Foot pedals are used to enable surgeon control of various functions that are not already controlled by the telemanipulator input device.

[0114] The two foot pedals used with the robotic surgical system are multi-input foot pedals similar to those used in the current standard of care for cataract surgery. One foot pedal is from the marketed phacoemulsification system connected to the robotic surgical system. This foot pedal is unmodified from its marketed state and is used as-is within the surgeon-side cockpit (e.g., described in more detail herein with respect to the phacoemulsification section). All functionalities of the marketed Phaco system will be operational and available for the surgeon during Teleoperation mode.

[0115] The other multi-input foot pedal is a multifunction foot pedal designed for ophthalmic visualization systems and allows the surgeon control of the visualization system (e.g., zoom, focus), corneal hydration, and enabling/disabling (clutching) of telemanipulator control of the system. These functions are similar to those controlled by foot pedals for marketed ophthalmic visualization systems.

Surgeon Display (3D Monitor)
[0116] The Cockpit may include a Surgeon Display that serves as the main medium for information exchange between the surgeon and the patient-side Surgical Cart. The Surgeon Display may have two primary purposes:
1. Real-time, 3D color visualization of the surgical field via the visualization system's stereo digital microscope.
2. An integrated User Interface for the surgeon to monitor, modify, pause, and cancel surgical operations.

[0117] The 3D monitor is located at the surgeon-side Cockpit; therefore, the surgeon seated at the Cockpit can easily visualize and interact with the system. The monitor is height-adjustable to accommodate surgeons of different heights. The 3D monitor requires the surgeon to don a pair of 3D glasses, which enable the stereopsis "3D" effect of the monitor.

Surgical Graphical User Interface (GUI)
[0118] The primary modality through which the surgeon can interface with the robotic surgical system is the surgical Graphical User Interface (GUI). A full-featured GUI is incorporated into the surgeon-side Cockpit, where the surgeon can monitor, modify, visualize, and execute the procedure via a screen with 3D visualization that can be controlled with a telemanipulator and/or mouse. The nurse can also use the surgical GUI located at the Patient-side Cart to intervene in the surgical procedure, if necessary.
[0119] Through the surgical GUI, the surgeon can approve surgical paths suggested by the system and adjust a variety of parameters that include (but are not limited to) phacoemulsification settings, digital microscope settings, and surgical instrument exchange requests. All crucial patient health information is displayed to the surgeon on the GUI, and a speaker is incorporated to provide auditory feedback. In addition to a physical emergency stop button, the GUI provides an additional easy-to-access interface to conduct an emergency stop during the procedure. The GUI also provides the interface for the surgeon to switch between operating modes (e.g., described in more detail with respect to the Operating Modes section). An embodiment of the robotic surgical system's surgical GUI 721 is presented in FIG.7.

[0120] The surgical GUI has the capability to be reconfigured based on surgeon preferences and to focus only on specific views or information, such as the digital microscope. As surgical steps change throughout the procedure, the instrument-specific functions that display on the surgical GUI change as well, to only display information that is relevant to the current step.

Cockpit Mechanical Hardware (Frame)
[0121] The purpose of the Cockpit Frame is to serve as a mechanical structure to which the two telemanipulators mount. The frame also provides the surgeon a comfortable structure on which to rest their arms while operating in either assistive or teleoperation mode.

[0122] The Cockpit Frame encapsulates all the hardware related to the Surgeon-side Cockpit telemanipulators and foot pedals and ensures that they are mounted to a rigid structure. Additionally, the frame houses the third-party phacoemulsification system pedal. The Cockpit Frame is mobile and can be easily moved around the operating room. Once a location in the room is set by the operating room staff, it can be locked in place for use during surgery.

Cockpit Computer Hardware
[0123] The Cockpit utilizes two computers, running the following Operating Systems respectively:
1. Real-time Ubuntu 20.04 LTS. The Linux kernel of Ubuntu 20.04 LTS is real-time patched, and performance has been benchmarked by cyclic test. The computer connects to haptic input devices and other subsystems via USB cables and Ethernet cables.
2. Windows 10 Professional 64bit. The Windows PC serves as a display PC for video streams from the Visualization System.
[0124] The computer running the real-time Ubuntu operating system serves as the host PC for controlling haptic input devices and communicating with the low-level control PC, which controls the micromanipulators on the patient-side Surgical Cart. The second computer, running the Windows 10 operating system, serves as the PC for displaying images/video streams.

Software System Architecture Design
[0125] The system architecture of the robotic surgical system consists of three main blocks: surgeon-side Cockpit, operating room, and server room. Each block contains a number of computing units, which host software packages and manage the communication between units. In this partially distributed system, most responses are event-triggered, such that a number of agents (e.g., direct control from a surgeon, emergency stops, and surgeon-monitored execution of precomputed trajectories) are able to share control of the system.

[0126] According to certain examples, subsystems of the robotic surgical system may run Ubuntu, and the PCs that require real-time performance have their computing kernels real-time patched.

Software Architecture Design
[0127] The software architecture of the robotic surgical system takes into consideration reusability, scalability, maintainability, and modifiability. FIG.7 shows an embodiment of the overall software architecture that connects the human surgeon to the robotic system. Overall, the Polaris System™ software architecture consists of four layers (FIG.7):
1. Human–Robot Interaction 702
2. Behavior-Tree-based Task Management 704
3. Communication Layer (Middleware) 706
4. Physical Layer (Subsystems) 710

Layer 1:
[0128] The Human–Robot Interaction layer 702 serves as an overarching mechanism for human surgeons to start, pause, resume, and cancel operations of the robotic system. Specifically, it is implemented in the form of a finite state machine (FSM) with all transitions between states defined. The software resides in the surgeon's graphical user interface (GUI), which enables the surgeon to execute necessary functions during the surgery such as supervising the surgery, modifying trajectories (e.g., lens grooving depth or corneal incision tunnel length), and specifying parameters (e.g., ultrasound power or surgical instrument speed).
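As a concrete illustration of the Layer 1 start/pause/resume/cancel mechanism, the following sketch implements a small finite state machine. The states and transition table shown here are a simplified assumption for illustration, not the system's actual transition definitions.

```python
# Illustrative FSM for Layer 1 human-robot interaction; states and
# transitions are simplified assumptions.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    RUNNING = auto()
    PAUSED = auto()
    CANCELLED = auto()

# (current state, event) -> next state; any undefined pair is rejected.
TRANSITIONS = {
    (State.IDLE, "start"): State.RUNNING,
    (State.RUNNING, "pause"): State.PAUSED,
    (State.PAUSED, "resume"): State.RUNNING,
    (State.RUNNING, "cancel"): State.CANCELLED,
    (State.PAUSED, "cancel"): State.CANCELLED,
}

class HumanRobotInteractionFSM:
    def __init__(self):
        self.state = State.IDLE

    def dispatch(self, event):
        key = (self.state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"transition '{event}' not defined from {self.state}")
        self.state = TRANSITIONS[key]
        return self.state
```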
Layer 2:
[0129] The Task Management/behavior tree layer 704 serves as the core layer for assisted surgery mode. The Task Manager, with surgeon approval, executes the desired surgical flow. The task sequence and logic are manifested as a behavior tree (BT). Via the task management visualization GUI, the surgeon can modify and monitor the surgical procedure in real-time.

Layer 3:
[0130] The Communication layer 706 handles communication between distributed computing units. The layer is implemented as software commonly referred to as "middleware," and its communication protocols include publish/subscribe, client/service, and shared data/memory.

Layer 4:
[0131] The Physical layer 710 in the robotic surgical system refers to software residing in different subsystems (e.g., low-level control). In general, these subsystems receive commands via Layer 3 from Layer 2, execute corresponding actions, and return status via Layer 3 to Layer 2, to trigger the next step.

Multimodal Human-Robot Interaction
[0132] The system may be operated either in (a) assistive mode guided by the behavior tree (BT) or (b) teleoperation mode. To switch between operating modes, the system requires a stop ("suspend") action and, if necessary, a transition to a recovery mode, then to the same or another operating mode. For example, if the surgeon wishes to take over the operation while the system is operating under assistive mode, they first pause the system, then switch to teleoperation mode. To go in the opposite direction (teleoperation mode to assistive mode), the "recovery" state exists to safely handle the details of that transition.

Implementation
[0133] A hierarchical priority is also implemented for local decision-making when switching between modes. The multimodal human-robot interaction is implemented as a finite state machine (FSM), which enumerates the possible states, transitions, and transition triggers.

[0134] The hierarchical priority is implemented as a bitfield, the value of which dictates how the system behaves.

Task Management Behavior Tree
[0135] To fulfill the requirements of the Polaris System™ assisted surgery mode, task management is performed using a behavior tree as the task logic management framework. Behavior tree task management is implemented to execute surgeon-monitored precomputed trajectories and surgical flow.

[0136] Another critical feature in our implementation of behavior tree task management is "real-time status visualization". Once the behavior tree is triggered to start, the GUI editor/monitor shows the system's status in real-time by highlighting the tree node the system is currently at.
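To make the behavior-tree structure concrete, the sketch below implements a minimal sequence node and action leaves, where each tick surfaces a running/success/failure status of the kind a monitor GUI could highlight. The node types and the example fragment are illustrative assumptions, not the actual surgical task logic.

```python
# Minimal behavior-tree sketch; node types and the example tree are
# illustrative assumptions only.
SUCCESS, FAILURE, RUNNING = "success", "failure", "running"

class Action:
    """Leaf node wrapping one robot action (e.g., one precomputed trajectory)."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def tick(self):
        return self.fn()

class Sequence:
    """Ticks children in order; stops at the first child not reporting success."""
    def __init__(self, name, children):
        self.name, self.children = name, children

    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != SUCCESS:
                return status  # RUNNING/FAILURE surfaced to the monitor GUI
        return SUCCESS

# Hypothetical fragment of a surgical flow: each leaf is one supervised substep.
step = Sequence("corneal_incision", [
    Action("await_surgeon_approval", lambda: SUCCESS),
    Action("move_to_incision_site", lambda: RUNNING),  # highlighted while running
    Action("execute_incision_path", lambda: SUCCESS),
])
```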
Visualization Software
[0137] The visualization software manages communication with the various subsystems of the visualization system, specifically the OCT engine, the digital microscope cameras, and the illumination subsystem. The software is configured to provide real-time, low latency display of data streams from both imaging modalities, and to efficiently transmit this data to other software subsystems of the robotic surgical system.

OCT
[0138] The OCT acquisition and display software manages communication with various components of the OCT engine. Specifically, it communicates with the OCT laser (for controlling laser configuration), the Data Acquisition (DAQ) card (for generating analog drive waveforms for the galvanometers), a liquid lens driver (for controlling the focus of the OCT system), and the high speed digitizer (for analog to digital conversion of the OCT data stream). Data from the high speed digitizer is processed in real-time, leveraging GPU acceleration. Processed data is displayed to the user, and transmitted to other software subsystems, as needed.

Digital Microscope
[0139] The digital microscope software serves two high-level functions. First, it controls the illumination system of the digital microscope, allowing independent control of the oblique illumination and red reflex illumination channels. Illumination control is achieved via communication over a USB interface to a custom designed LED driver board, which in turn directly controls the drive current to four oblique illumination LEDs and two red reflex illumination LEDs.

[0140] In addition, the digital microscope software also communicates with both cameras of the microscope. The software enables digital panning and zoom, performs real-time image processing, and then transmits the resulting stereo video feed to the 3D display and other software subsystems, as needed. Communication with the cameras is performed over a 10 GigE interface, and video is streamed at up to 20.3MP per channel and up to 60 frames per second. Image processing includes debayering, distortion correction, white balance, and contrast adjustment/histogram equalization. Latency of the entire system, including latency of the 3D display (FSN), has been measured to be less than 85ms.
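The per-frame stages listed above (debayering, distortion correction, white balance, and contrast adjustment) can be sketched with standard image-processing primitives. The OpenCV calls below are one plausible realization, and the Bayer pattern, calibration inputs, and white-balance gains are assumptions for illustration.

```python
# Illustrative per-frame processing sketch; Bayer pattern, calibration, and
# gains are assumed inputs, not the system's actual parameters.
import cv2
import numpy as np

def process_frame(raw_bayer, camera_matrix, dist_coeffs, wb_gains):
    # Debayering: reconstruct a color image from the sensor's Bayer mosaic.
    bgr = cv2.cvtColor(raw_bayer, cv2.COLOR_BayerRG2BGR)
    # Distortion correction using the camera's intrinsic calibration.
    bgr = cv2.undistort(bgr, camera_matrix, dist_coeffs)
    # White balance: apply per-channel gains (assumed precomputed).
    bgr = np.clip(bgr.astype(np.float32) * wb_gains, 0, 255).astype(np.uint8)
    # Contrast adjustment: histogram equalization on the luma channel only.
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
    ycrcb[..., 0] = cv2.equalizeHist(ycrcb[..., 0])
    return cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
```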
Sensing of Anatomical Structures
[0141] In each surgical step of the robot-assisted cataract surgery, the robotic surgical system performs three sub-steps: 1) planning of the surgical path, 2) real-time monitoring during execution, and 3) assessment after the step is finished. The purpose of sensing anatomical structures is to allow the robot to perform these three tasks optimally and safely for a patient, based on the real-time monitoring of the operating volume of the eye.
Design Details
[0142] The sensory information (i.e., from the digital microscope and OCT) gathered from the visualization system is processed with algorithms to extract high-level information, such as the location of key anatomical features. This information is then sent to the path planner modules. In certain examples, an algorithm receives an OCT scan and processes it to find the location of the incision relative to the visualization system. This precomputed incision location is superimposed onto the surgical GUI such that the surgeon can either approve or modify the surgical path for the corneal incision surgical step.
[0143] In certain examples, a sensing algorithm may detect relevant anatomical structures in an OCT image.
Surgical Path Planning
[0144] The purpose of Surgical Path Planning is to generate an optimal surgical path that satisfies the safety and efficacy requirements for each surgical step. “Surgical Path” is defined as the physical trajectory of the surgical instrument as it interacts with the eye anatomy. It includes surgical instrument location, pose, and speed, as well as surgical instrument functionality such as phacoemulsification, irrigation, and injection.
Subsystem Description:
[0145] The Surgical Path Planning subsystem is software that receives data inputs from the visualization subsystem (RGB and OCT data) and parameter inputs from the surgeon (via the GUI). It is a sequential process that begins with acquisition of anatomical data and ends with the trajectory data of the surgical path being sent to the robotic system.
[0146] In order, the steps of Surgical Path Planning in certain examples may include (a sketch of this pipeline follows the list):
1. Raw visual data (both RGB and OCT) of the current anatomy is acquired by the visualization subsystem
2. The raw visual data is processed to extract required anatomical information such as the location of anatomical features
3. The surgical path is planned using the extracted anatomical information alongside surgeon-specified parameters such as surgical instrument speed
4. The system proposes a surgical path and the surgeon modifies and/or accepts it
5. The surgical path is communicated to the robotic system for execution
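A minimal, non-limiting sketch of the five-step pipeline above follows; the data structures and stub functions are hypothetical placeholders for the actual acquisition, extraction, and planning algorithms.

```python
from dataclasses import dataclass

@dataclass
class AnatomyInfo:
    incision_site: tuple    # (x, y, z) in the visualization frame
    pupil_center: tuple

@dataclass
class SurgeonParams:
    instrument_speed_mm_s: float   # surgeon-specified parameter from the GUI

def acquire_visual_data():
    """Step 1: stand-in for grabbing an RGB frame and an OCT volume."""
    return {"rgb": None, "oct": None}

def extract_anatomy(raw):
    """Step 2: stand-in for the feature-extraction algorithms."""
    return AnatomyInfo(incision_site=(1.0, 0.0, 0.5), pupil_center=(0.0, 0.0, 0.0))

def plan_path(anatomy, params):
    """Step 3: produce a list of (position, speed) waypoints."""
    return [(anatomy.incision_site, params.instrument_speed_mm_s)]

def surgeon_review(path, approve=lambda p: p):
    """Step 4: the surgeon may modify and/or accept the proposed path."""
    return approve(path)

def send_to_robot(path):
    """Step 5: hand the trajectory to the robotic system for execution."""
    print("executing", path)

send_to_robot(surgeon_review(plan_path(extract_anatomy(acquire_visual_data()),
                                       SurgeonParams(instrument_speed_mm_s=2.0))))
```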
Real-time Operation
Low-Level Motion Control
[0147] Real-time motion control of the Micromanipulator is achieved by running the low-level software on a PC with a real-time operating system (RTOS) and using the EtherCAT (Ethernet for Control Automation Technology) protocol to exchange position and velocity commands between the Low Level PC and the motor drivers. The Low Level PC sends out commands at 1kHz to the motor drivers and receives the status of the motors (e.g., motor encoder values) from the drivers. The drivers use the command information to control the motion of the motors.
Application Programming Interface (API)
[0148] The function API is responsible for receiving commands from the engineering GUI, the behavior tree-based task manager, or the surgical GUI, and calling the corresponding functions embedded in the low-level motion control library. In other words, operating modes may be switched at the high level, and the function API serves as an abstraction layer connecting high-level operating modes and low-level motion control.
Teleoperation Software
Position Control of Surgical Instruments
[0149] The teleoperation software may be developed to allow surgeons to intuitively control the surgical instrument (installed on the Micromanipulator) using the Cockpit telemanipulators (e.g., described in more detail with respect to the surgeon-side cockpit section). To accomplish this, the teleoperation software framework 109 shown in FIG.1 can be implemented.
[0150] Once the surgeon moves the telemanipulator, the encoder of the telemanipulator registers the motion of the surgeon’s hand. This information is then transformed following 11 steps to move the Micromanipulator in a way that the motion appears intuitive to the surgeon, as if they were directly controlling the surgical instruments. The 11 steps are summarized as follows (a sketch of the core mapping steps follows the list):
1. Motion of the surgeon's hand is captured by the encoders of the telemanipulator.
2. The encoder values are filtered to remove the encoder measurement noise.
3. The position of the telemanipulator gripper is calculated using robot modeling (forward kinematics) and the filtered encoder values.
4. The information from 3 is filtered to remove the tremor of the surgeon’s hand.
5. The information from 4 is scaled according to the scaling factor set by the surgeon. This scaling factor is set by the surgeon during initialization, and can be modified throughout the procedure via the Surgical GUI.
6. The information from 5 is then mapped to obtain the end effector position of the Micromanipulator with respect to the camera. This is an essential step for the intuitiveness of the motion as it appears to the surgeon.
7. The information from 6 is then transformed into the coordinate system of the Micromanipulator.
8. The information from 7, along with Micromanipulator modeling (inverse kinematics), is used to obtain the desired joint values of the Micromanipulator.
9. The information from 8 is used to obtain the desired motor positions of the Micromanipulator.
10. The motors of the Micromanipulator are commanded.
11. The Micromanipulator moves such that controlling the surgical instruments in teleoperation mode appears intuitive to the surgeon.
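By way of non-limiting illustration, the following sketch shows the core of steps 4 through 7 (tremor filtering, scaling, and frame mapping), assuming identity rotations between frames and a simple exponential smoothing filter; the constants and frame transforms are hypothetical.

```python
import numpy as np

ALPHA = 0.2     # exponential smoothing factor used here for tremor filtering
SCALE = 0.25    # surgeon-set motion scaling (adjustable via the Surgical GUI)

# Fixed rotations: telemanipulator frame -> camera frame -> Micromanipulator
# base frame. Identity matrices here purely for illustration.
R_TELE_TO_CAM = np.eye(3)
R_CAM_TO_MICRO = np.eye(3)

def tremor_filter(prev, new):
    """Step 4: smooth the gripper position to suppress hand tremor."""
    return ALPHA * new + (1.0 - ALPHA) * prev

def map_increment(delta_tele):
    """Steps 5-7: scale the filtered increment and re-express it in the
    Micromanipulator coordinate system via the camera frame."""
    delta_cam = R_TELE_TO_CAM @ (SCALE * delta_tele)
    return R_CAM_TO_MICRO @ delta_cam

prev = np.zeros(3)
raw = np.array([0.002, 0.0, 0.0])        # 2 mm hand motion, from forward kinematics
delta = tremor_filter(prev, raw) - prev  # smoothed increment (0.4 mm with ALPHA=0.2)
print(map_increment(delta))              # scaled to a 0.1 mm tool increment
```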
Clutching
[0151] Since the workspaces and motion scales of the telemanipulator and Micromanipulator are not identical, surgeons occasionally need to perform clutching to adjust the placement of their hands to a comfortable configuration. During clutching, the Micromanipulator does not move while the surgeon adjusts the position of the telemanipulator. Clutching is triggered when the surgeon presses the clutch button at their feet. Once the surgeon is comfortable and ready to resume the operation, they press the clutch button again to regain control of the surgical instrument.
Surgical GUI Software
[0152] The surgical GUI software is implemented in the framework of a web browser to ensure its compatibility with different potential display/hardware platforms.
[0153] The surgeon and nurses are able to monitor the system status on the GUI software, which resides on hand-held touchpads. Nurses have access to a subset of the functionalities the surgeon has.
Peripheral Sensors Software
[0154] Peripheral sensors serve as part of the data aggregation and processing pipeline, which is part of the whole system architecture.
[0155] In general, the system status monitor aggregates the data from the Macro Arms, Micromanipulators, and peripheral sensors. Although the data collection frequencies are different, we synchronize them according to system clocks and interpolate the missing data from sensors with lower frequencies.
[0156] Communication-wise, peripheral sensors talk to the system status monitor, which resides in the central command PC, via USB/Ethernet.
System Software
[0157] The other system software, which is not native to the Linux operating system, may include (a) a system health monitor; (b) a local decision maker; and (c) a local data processor.
System Health Monitor (SHM)
[0158] This software aggregates data from different sources and synchronizes it. Although no local decision is made, the SHM streams all data to the Local Decision Maker for further decision-making.
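As a non-limiting illustration of the synchronization described in [0155], the following sketch linearly interpolates a lower-rate sensor stream onto a common system clock; the rates shown are hypothetical.

```python
import numpy as np

def resample(timestamps, values, common_clock):
    """Linearly interpolate a low-rate sensor stream onto the system clock."""
    return np.interp(common_clock, timestamps, values)

# A 1 kHz system clock and a 50 Hz peripheral sensor (hypothetical rates).
clock = np.arange(0.0, 1.0, 1e-3)               # 1 kHz system clock ticks
sensor_t = np.arange(0.0, 1.0, 0.02)            # 50 Hz sensor timestamps
sensor_v = np.sin(2 * np.pi * sensor_t)         # stand-in sensor readings
aligned = resample(sensor_t, sensor_v, clock)   # missing samples interpolated
print(aligned.shape)                            # (1000,): one value per clock tick
```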
Local Decision Maker (LDM)
[0159] Without sending requests for approval to the behavior tree task manager, the LDM could trigger the system’s responses in multiple ways. For events not included in the LDM’s error codes, the LDM submits the bitfield information to the behavior tree task manager for further evaluation.
Local Data Processor (LDP)
[0160] This software is responsible for data logging and preliminary post-processing. Multiple data streams are handled in different threads and saved in the corresponding local databases. Due to the sensors’ intrinsically high frequencies, raw sensor data is not fully recorded by the LDP. Instead, post-processed “health” data, which is more human-readable, is recorded in the system journal. The other important data to be logged and processed include image and video streams.
Data Management and Cybersecurity
[0161] Polaris System™ data management consists of two major parts: local data storage and management (LDSM) and cloud data storage and management (CDSM). LDSM takes care of local data storage in a dedicated SATA hard drive physically residing with the system, and its corresponding data uploading, cleaning, and cybersecurity. CDSM takes care of remote data storage in dedicated cloud servers physically residing in remote data centers, and its corresponding database management and cybersecurity.
Integration with Marketed Devices
Phacoemulsification System
[0162] Phacoemulsification systems are systems that provide functionality to a specific subset of surgical instruments. These systems typically consist of three main components: a console, a foot pedal, and a set of surgical instruments. The console houses a graphical user interface and/or physical controls and is responsible for controlling and monitoring the surgical instruments connected to it. The foot pedal allows the surgeon to control the functions of the system and the attached surgical instruments. The surgical instruments include an irrigation/aspiration handpiece and a phacoemulsification probe. Important surgical instrument functionality includes irrigation, aspiration, and/or ultrasound. Several phacoemulsification systems are currently commercially available and have received widespread acceptance into operating rooms due to their safety and ability to optimize surgical outcomes. Exemplary phacoemulsification systems suited for use with the Polaris System™ are the Bausch + Lomb Stellaris and the Alcon Centurion Vision System.
Integration in the robotic surgical system:
[0163] The phacoemulsification system is connected to the robotic surgical system to control its functions and to receive feedback on its status. FIG.8 shows an exemplary schematic of how the third-party phacoemulsification system is integrated into the robotic surgical system. Some components may represent the unaltered, commercially available phacoemulsification system.
[0164] In assistive mode (e.g., described in more detail with respect to the assistive mode section), the functionality is controlled by the robotic surgical system via software through a dedicated communication cable. This is two-way communication that connects the Central Command CPU of the Surgical Cart to the phacoemulsification system. The Central Command CPU sends commands to the phacoemulsification system and the phacoemulsification system sends status messages in return. Status updates are used for assisted control and guidance. Phacoemulsification settings such as ultrasound power, irrigation, and aspiration will be proposed by the Polaris System™ for approval by the surgeon. During the process, the surgeon is able to modify and/or override all of the commands being sent to the phacoemulsification system.
[0165] In teleoperation mode (e.g., described in more detail with respect to the teleoperation mode section), the standard foot pedal of the phacoemulsification system is provided to, and used by, the surgeon in the surgeon-side Cockpit. The cable connecting the foot pedal to the phacoemulsification system is unmodified and sends only surgeon foot pedal commands to the phacoemulsification system. An additional communication line is provided that returns status information from the phacoemulsification system to the Cockpit.
Off-The-Shelf Surgical Instruments
[0166] The Polaris System™ can be adapted for use with Off-The-Shelf (OTS) surgical instruments that are a subset of those used in current practice for cataract surgery. Due to use of the Universal Surgical Instrument Adapter (e.g., the “USIA” that is described in more detail with respect to the Universal Surgical Instrument Adapter section), OTS instruments can be used without modification. All instruments can be extracted safely by accounting for the instrument geometries and anatomy at the time of extraction.
[0167] A list of surgical instruments that may be adapted for use by the Polaris System™ is provided in Table 4. This table also includes the tool-specific functionality and Surgical Instrument Holder configuration that is required for each surgical instrument:
Tool | Manufacturer | Model | Tool Holder Type
Table 4: List of marketed off-the-shelf surgical instruments which can be adapted for use by the Polaris System™ via the USIA interface. [0168] In addition to the OTS Surgical Instruments to be used with the Polaris System™, a commercially available phacoemulsification system may be connected to and operated by the surgical robotic system as described herein. [0169] Other details relating to the building and organization of these systems can be found in U.S. Patent No.9,283,043, entitled “Apparatus, System, and Method for Robotic Microsurgery”, and filed Jan 7, 2013; U.S. Patent Publication No.2021/0228292, entitled “System and Method for Automated Image-Guided Robotic Intraocular Surgery”, and filed November 3, 2020; and Chen CW, Lee YH, Gerber MJ, Cheng H, Yang YC, Govetto A, Francone AA, Soatto S, Grundfest WS, Hubschman JP, Tsao TC. Intraocular robotic interventional surgical system (IRISS): Semi-automated OCT-guided cataract removal. Int J Med Robot. 2018 Dec;14(6):e1949. doi: 10.1002/rcs.1949. Epub 2018 Aug 28. PMID: 30152081, the entire disclosures of which are incorporated by reference herein.
[0170] FIG.1 illustrates an exemplary robotic surgical system 100. Patient motion sensor 106 may include situational awareness features, such as a wide angle camera or video recorder that provides situational awareness by capturing the operating environment, such as an operating room. Illumination source 116 may have a red reflex feature. As shown here, there is a patient-side surgical cart 102 which may include a standard set of surgical tools 126, which may be unmodified from their commercially marketed form, or in other examples customized to a procedure, patient, or surgeon. Surgical tools 126 may be sequentially loaded and unloaded by a tool exchange system 124 onto a plurality of robotic arms, such as left robotic arm 140 and right robotic arm 138, which may be uniquely designed and configured for ophthalmic surgical procedures and actuate 100A surgical tools 126. For further details on surgical tools 126, the tool exchange system 124, and robotic arms (such as left and right robotic arms 138/140), see International Patent Application No. PCT/US2024/010569 entitled “AUTONOMOUS TOOL EXCHANGE SYSTEM FOR AUTOMATED AND SEMI-AUTOMATED INTRAOCULAR SURGICAL PROCEDURES”, filed January 5, 2024; and International Patent Application No. PCT/US2024/010586 entitled “SURGICAL TOOL HOLDER FOR INTRAOCULAR ROBOTIC SURGICAL SYSTEMS”, filed January 5, 2024; the entire disclosures of which are incorporated by reference herein.
[0171] A real-time control computer 132 may be configured to send control commands to the robotic arms 138/140 as well as visualization subsystem 114 and a phacoemulsification system 136, which may be a third-party phacoemulsification system, and which may be coupled to or interface 100H with one or more robotic arms such as right robotic arm 138. For further details on visualization subsystem 114, see International Patent Application No. PCT/US2024/010536 entitled “SURGICAL MICROSCOPES USING RED REFLEX ILLUMINATION”, filed January 5, 2024, the entire disclosure of which is incorporated by reference herein.
[0172] A central control computer 128 may be configured to manage the system’s behavior and decision-making logic, and may include a system management engine 130. Central control computer 128 may receive several inputs from several sources including, for example, a visualization and path planning computer 108, real-time control computer 132 (which may actuate 100B robotic arms 140/138 via robotic arm controllers 134), and cockpit computer 103 on surgeon-side cockpit 101. Such inputs may be integrated with, in certain examples, further inputs such as real-time data from a patient motion sensor 106 and emergency stop 104. Output commands from central control computer 128 may be sent to real-time control computer 132 and cockpit computer 103 on surgeon-side cockpit 101.
[0173] A visualization and path planning computer 108 may be configured to acquire multimodal imaging data from the visualization subsystem 114 and perform image processing algorithms 110, and in certain examples, path planning algorithms, in real-time. The path planner 112 transforms the input imaging data into surgical tool path data, which is then sent 100C to the central command computer 128.
[0174] Visualization subsystem 114, further described in International Patent Application No. PCT/US2024/010536 entitled “SURGICAL MICROSCOPES USING RED REFLEX ILLUMINATION”, filed January 5, 2024, the entire disclosure of which is incorporated by reference herein, may include one or more digital microscopes (DM) 120, which may be stereo digital microscopes, coupled to an illumination source 116, as well as one or more optical coherence tomography (OCT) probes 122 or devices, which may be coupled to an OCT source 118. In certain examples, digital microscope 120 may be used to provide anatomical landmarks to direct OCT scanning via OCT probe 122. Visualization subsystem 114 provides real-time 3D monitoring of the operative eye.
[0175] A patient interface (docking) system 142, further described in International Patent Application No. PCT/US2024/010589 entitled “DOCKING STATION FOR USE IN OPHTHALMIC PROCEDURES”, filed January 5, 2024, the entire disclosure of which is incorporated by reference herein, may temporarily attach to the operative eye in order to fix the eye relative to robotic arms 138/140 and visualization subsystem 114.
[0176] An automated tool exchange system 124, further described in International Patent Application No. PCT/US2024/010569 entitled “AUTONOMOUS TOOL EXCHANGE SYSTEM FOR AUTOMATED AND SEMI-AUTOMATED INTRAOCULAR SURGICAL PROCEDURES”, filed January 5, 2024, and International Patent Application No. PCT/US2024/010586 entitled “SURGICAL TOOL HOLDER FOR INTRAOCULAR ROBOTIC SURGICAL SYSTEMS”, filed January 5, 2024; the entire disclosures of which are incorporated by reference herein, may hold surgical tools 126 required for an ophthalmic procedure. In certain examples, at any point during or surrounding a procedure, central command computer 128 may request a tool exchange to swap out one or more of the surgical tools 126 within the tool exchange system 124. As a result of such a request, the real-time control computer 132 may send a command to reposition the robotic arms 138/140 and to perform a predetermined motion that deposits the currently held surgical tool from among surgical tools 126 onto the tool exchange system 124. A similar request may then be sent to acquire the requested surgical tool from among the surgical tools 126.
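A minimal, non-limiting sketch of the deposit-then-acquire exchange sequence described in [0176] follows; the classes and slot names are hypothetical stand-ins for tool exchange system 124 and the predetermined arm motions.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ToolExchangeSystem:
    """Stand-in for tool exchange system 124: named slots holding tools."""
    slots: dict = field(default_factory=dict)

    def deposit(self, slot, tool):
        self.slots[slot] = tool

    def withdraw(self, slot):
        return self.slots.pop(slot)

@dataclass
class RoboticArm:
    held_tool: Optional[str] = None

def exchange_tool(arm, exchanger, deposit_slot, acquire_slot):
    """Deposit the currently held tool via a predetermined motion,
    then acquire the requested tool with a second predetermined motion."""
    if arm.held_tool is not None:
        exchanger.deposit(deposit_slot, arm.held_tool)
        arm.held_tool = None
    arm.held_tool = exchanger.withdraw(acquire_slot)
    return arm.held_tool

exchanger = ToolExchangeSystem(slots={"slot_2": "capsulorhexis_forceps"})
arm = RoboticArm(held_tool="keratome")
exchange_tool(arm, exchanger, deposit_slot="slot_1", acquire_slot="slot_2")
print(arm.held_tool, exchanger.slots)   # forceps now held; keratome stored
```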
[0177] According to certain examples, there may also be a patient-side computer 146 configured to provide a display of key surgical information to the surgical staff, such as nurses 199 or other users, for example via a graphical user interface (GUI) 148. Outputs from patient-side computer 146 may be sent to central control computer 149. In certain examples, patient-side computer 146 may be facing or adjacent to the operative eye of the patient 145.
[0178] There may also be a surgeon-side cockpit 101 which may include a cockpit computer 103 having a surgeon graphical user interface (GUI) 105, a path approval interface 107, teleoperation software 109, and a display 111. Surgeon-side cockpit 101 may face or be used by the surgeon 150. Cockpit computer 103 may manage communication between surgeon-side cockpit 101 and patient-side surgical cart 102, as well as host teleoperation software 109 and provide a display (surgeon graphical user interface (GUI) 105) to the surgeon 150 seated at the surgeon-side cockpit 101. In certain examples, the surgeon-side cockpit 101 may be known as a surgical controller or the Polaris System™. Surgeon graphical user interface (GUI) 105 may include a path approval interface 107 for a surgeon to approve or modify a trajectory path of the robotic surgical system. There may also be a telemanipulator with foot pedal 113, which the surgeon 150 may use like a joystick to control robotic arms 138/140. In certain examples, a phacoemulsification control pedal 115 may communicate with and send commands 150C to phacoemulsification system 136.
[0179] FIG.2 illustrates an exemplary structure and layout of a surgeon-side cockpit 203 of the robotic surgical system. Shown here are the layout of, and communication between, subsystems of the surgeon-side cockpit 203, which may be separated from surgical cart 201 and patient-side surgical cart 202, and which interact with these and also with components within the surgeon-side cockpit 203 via tool position control commands 221, tool-specific functionality commands 222, and visual feedback 223. Two telemanipulators 210/211 may be used by the surgeon, similar to joysticks, to control motion of the robotic arms of the patient-side surgical cart 202. This may be accomplished via tool position control commands 221 from telemanipulators 210/211 sending data to CPU 206, which may interface with and send visual feedback 223 to monitor 208. CPU 206 may also receive visual feedback 223 from patient-side surgical cart 202. Telemanipulators 210/211 also have associated grippers 212/213, respectively. Telemanipulators 210/211 may interact with CPU 206 via tool position control commands 221, and CPU 206 may interact with patient-side surgical cart 202 via tool position control commands 221. Also shown is CPU 206, which according to certain embodiments may be a cockpit computer similar to or the same as cockpit computer 103 shown in FIG.1. Foot pedal 216 may be controlled via a touchscreen GUI 218 to send tool-specific functionality commands 222 to CPU 206, which may also receive inputs from third-party phacoemulsification pedal 214. A chair 220 for the surgeon to sit in may also be a part of surgeon-side cockpit 203.
[0180] FIG.3 illustrates exemplary modules for operation of the robotic surgical system during ophthalmic procedures 300. For example, shown is a control flow for the robotic surgical system, which may be in assistive mode. Eye 324 may be imaged via an imaging or visualization subsystem 326 which may include an OCT 328 and DM 330, which may have red reflex 332 and external illumination 334. Imaging 326 may output data to AI 336, which may interface with an OCT targeter 338 to focus the range of, or otherwise manipulate 300B, OCT 328. AI 336 may output to or interface with trajectory planner 310, which may receive inputs from a surgical plan 308, and from surgeon-side cockpit 302, including surgeon 304 interfacing with GUI 306. GUI 306 may also manipulate trajectory planner 310. Trajectory planner 310 may actuate or interface with micromanipulator 314 (which may also be affected by micromanipulator 312). Micromanipulator 314 may actuate surgical instrument 318, which may also receive inputs from phacoemulsification system 316. Surgical instrument 318 may then operate on the operative eye 324, which may also interface with a docking system 320. Docking system 320 may be retractable. In certain embodiments, a suction feature may operate separately or in conjunction with hydration 322. In certain examples, hydration 322 may also be provided to the operative eye 324. In yet other examples, docking system 320 may provide hydration 322 to the operative eye via an integrated hydration nozzle 300A, as further described in International Patent Application No. PCT/US2024/010589 entitled “DOCKING STATION FOR USE IN OPHTHALMIC PROCEDURES”, filed January 5, 2024, the entire disclosure of which is incorporated by reference herein. Surgical instrument 318 may be coupled with the tool exchange system.
[0181] OCT targeter 338 may implement an OCT targeting algorithm 300B to determine the desired imaging region of the operative eye via OCT device 328. This may be done, for example, via visualization and path planning computer 108 from FIG.1. In certain examples, trajectory planner 310 may be coupled to or have integrated AI 336. AI 336 may be configured to identify and create models for anatomical features of the operative eye from imaging 326 and may be configured to create and label low-dimensional representations of anatomical features via feature extraction. For example, DM 330 and/or OCT 328 may be integrated with AI 336 to detect a location of a tool tip. For example, a low-dimensional representation of the pupil may be a circle, and extracted data such as a radius or center of the pupil may be sent by AI 336 to trajectory planner 310.
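By way of non-limiting illustration, the following sketch reduces detected pupil-boundary pixels to the low-dimensional circle representation mentioned above using a least-squares (Kasa) fit; the boundary points here are synthetic, and the actual AI subsystem's feature extraction is not limited to this method.

```python
import numpy as np

def fit_pupil_circle(points):
    """Least-squares (Kasa) circle fit: reduces detected pupil boundary
    pixels to a low-dimensional representation (center, radius)."""
    x, y = points[:, 0], points[:, 1]
    # Circle model x^2 + y^2 = a*x + b*y + c solved as a linear system.
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x**2 + y**2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = sol[0] / 2.0, sol[1] / 2.0
    r = np.sqrt(sol[2] + cx**2 + cy**2)
    return (cx, cy), r

# Synthetic boundary points on a circle of radius 3 centered at (10, 20).
theta = np.linspace(0, 2 * np.pi, 100)
pts = np.column_stack([10 + 3 * np.cos(theta), 20 + 3 * np.sin(theta)])
center, radius = fit_pupil_circle(pts)
print(center, radius)   # recovers approximately ((10, 20), 3)
```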
[0182] FIG.4A illustrates an exemplary control flow of the robotic surgical system in assistive mode 400. Surgical robot 410 may include or be coupled to a micromanipulator as well as a plurality of surgical tools, which may be stored and managed via a tool exchanger system. In certain examples, a pre-operative eye scan 402 may be performed prior to beginning a surgical procedure to extract patient-specific metrics 404 and forward patient-specific metrics 401 to trajectory planner 406, which may be based on acquiring image data 411 of the operative eye and surgical tool from the visualization subsystem (OCT 414/DM 416), which may also receive anatomical landmarks 413. Patient-specific metrics 401 may include patient eye size, pupil shape, anatomical anomalies, etc. These metrics may be used throughout the surgery as a set of environmental conditions to modify the output of the trajectory planner. Trajectory planner 406 may output a trajectory 407, which may be overridden by the surgeon in a surgeon takeover (teleoperative mode) 403. The surgeon may also modify 405 the trajectory planner 406. Trajectory 407 may be output to phacoemulsification system 408 to actuate or interface with surgical robot 410, which operates on the eye 412 via a tool-tissue interaction 409.
[0183] FIG.4B illustrates an exemplary control flow of the robotic surgical system in teleoperative mode 425. Surgical robot 410 may include or be coupled to a micromanipulator as well as a plurality of surgical tools, which may be stored and managed via a tool exchanger system. Shown here is a digital microscope 435, which may be controlled by surgeon 441 operating an imaging foot pedal 437. Digital microscope 435 may be coupled to monitor 439 for surgeon 441 to view and command surgical robot 410 to operate on eye 412. Surgeon 441 may also operate phacoemulsification system 429 via phacoemulsification foot pedal 427, which interfaces with or actuates surgical robot 410. For further description, see the “Teleoperative Mode”, “Position Control of Surgical Instruments”, “Clutching”, and “Surgical GUI Software” sections above.
[0184] FIG.4C illustrates exemplary high-level states of a control flow for the robotic surgical system in assistive mode. Assistive mode may be used for one or more steps in a surgical procedure. In certain examples, steps of a surgical procedure include planning the surgical step 452, execution of the surgical step 454, and assessment of the surgical step 456. In certain examples, assessment 456 may include scanning, including instructions to the robotic surgical system to reassess via the multimodal visualization subsystem and to collect and supply imaging data to the path/trajectory planner. In certain examples, various steps 452/454/456, or sub-steps within a step, may occur simultaneously, and at the same or different operating frequencies. For example, assessment 456, which may include rescanning via the visualization subsystem, may occur at a rate of 50Hz, while collecting and supplying imaging data to the trajectory planner for execution 454 of the robotic arms may occur at a rate of up to 100Hz. In certain examples, some steps 452/454/456 or sub-steps may be omitted or occur independently. For further details, see the “Design Details” section above.
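As a non-limiting illustration of sub-steps running simultaneously at different operating frequencies, the following sketch runs a 50Hz assessment task alongside a 100Hz data-supply task in a single event loop; the rates, duration, and task bodies are placeholders.

```python
import asyncio

counts = {"assessment": 0, "supply": 0}

async def periodic(name, hz, duration=0.2):
    """Tick a named task at a fixed rate for `duration` seconds."""
    loop = asyncio.get_running_loop()
    end = loop.time() + duration
    while loop.time() < end:
        counts[name] += 1            # stand-in for rescan / data hand-off work
        await asyncio.sleep(1.0 / hz)

async def main():
    # Assessment rescans at 50Hz while imaging data is supplied at up to 100Hz.
    await asyncio.gather(periodic("assessment", 50), periodic("supply", 100))
    print(counts)                    # supply ticks roughly twice as often

asyncio.run(main())
```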
[0185] FIG.5 illustrates an exemplary process flow for execution of the robotic surgical system in teleoperative mode. The flow may begin with the surgeon moving a telemanipulator 502, and telemanipulator steps 501 including raw encoder data 504, filtered encoder data 506, raw end effector pose 508, and filtered end effector pose 510. This may be followed by teleoperation mapping steps 503 of filtered end effector pose of the telemanipulator 512, scaled filtered end effector pose 514, desired end effector pose of the micromanipulator in the camera frame 516, and desired end effector pose of the micromanipulator 518. Next are micromanipulator steps 505 including desired end effector pose of micromanipulator 520, desired joint positions of micromanipulator 522, desired motor positions of micromanipulator 524, and commanding motors of micromanipulator 526. Finally, there is the step of the micromanipulator moving in a way that appears intuitive to the surgeon 528. For further description, see the “Teleoperative Mode”, “Position Control of Surgical Instruments”, “Clutching”, and “Surgical GUI Software” sections above.
[0186] FIG.6 illustrates exemplary robotic manipulators and a visualization system of the robotic surgical system. As shown here, there are macro arms 602, micromanipulators 604, and an imaging system 614, which according to certain examples may be visualization subsystem 114 from FIG.1. Also shown is tool exchange system 624. Motion platform 610 is also shown, which may be configured to stabilize robotic manipulators 602/604 and imaging system 606. In certain examples, the tool exchange system 608 and micromanipulators 604 may be coupled to a surgical controller.
[0187] FIG.7 illustrates an exemplary software architecture for the robotic surgical system including a behavior tree layer 704. In certain examples, the exemplary software architecture may be implemented in central control computer 128 of FIG.1. Behavior tree layer 704 may act as a middle layer between high level inputs 720 and low level inputs 722. High level inputs 720 may include inputs from a surgeon in teleoperative mode (for example via surgeon GUI 721) and from assistive mode inputs, for example via visualization and path planning computer 108 from FIG.1. Another high level input 720 may include human-robot interaction GUI 702, which may include paused 712, idle 716, and running 714 states with options to toggle between states or maintain a state, for example by pausing 715 running state 714 to paused state 712, by cancelling 711 running state 714 to idle state 716, by finishing 719 running state 714 to idle state 716, by starting 718 idle state 716 to running state 714, by resuming 713 paused state 712 to running state 714, by cancelling 717 paused state 712 to idle state 716, or by maintaining 799 idle state 716. Low level inputs may include a communication layer 706 that includes Robot Operating System (ROS) middleware 708, as well as physical layer subsystems 710 including docking 724, motion stage 726, OCT probe/device 728, and all hardware 730. Behavior tree layer 704 may define the logical structure and decision-making process of the robotic surgical system. In certain implementations, the decision-making process may be designed to emulate the standard procedures used in manual cataract surgery.
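A minimal, non-limiting sketch of a behavior tree with the real-time status visualization described in [0136] follows; the node types and the corneal-incision task names are hypothetical, and `print` stands in for highlighting the active node in the GUI editor/monitor.

```python
class Action:
    """Leaf node: performs work and reports success or failure."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def tick(self, highlight):
        highlight(self.name)        # drives the GUI's real-time status view
        return self.fn()            # True = success, False = failure

class Sequence:
    """Composite node: runs children in order and fails fast,
    mirroring the ordering of surgical sub-steps."""
    def __init__(self, name, children):
        self.name, self.children = name, children

    def tick(self, highlight):
        highlight(self.name)
        return all(child.tick(highlight) for child in self.children)

tree = Sequence("corneal_incision", [
    Action("plan_path", lambda: True),
    Action("await_surgeon_approval", lambda: True),
    Action("execute_trajectory", lambda: True),
])
tree.tick(print)   # print stands in for highlighting the active GUI node
```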
[0188] FIG.8 illustrates exemplary interfaces between the robotic surgical system and a phacoemulsification system. In certain examples, the phacoemulsification system may include foot pedal 814, console 802, and phacoemulsification instruments 806, and CPU 816 may be part of the robotic surgical system. In this or other examples, exemplary interfaces may be between the surgeon-side cockpit 801 of robotic surgical system 100 from FIG.1 and a console 802 of a phacoemulsification system. Interfaces may include software, electrical, and fluidic interfaces. For example, shown is a communication cable 812 from a foot pedal 814 of the surgeon-side cockpit 801 to the console 820 of the phacoemulsification system. Also shown are communication lines 810 from the console 820 of the phacoemulsification system to a CPU 816 of the surgeon-side cockpit 801 of the robotic surgical system. In certain examples, CPU 816 may also include a planner engine, which may be a trajectory planner such as path planner 112 from FIG.1. Also shown are fluid and communication lines 804 between console 802 of the phacoemulsification system and phacoemulsification instruments 806 operating on a patient 808.
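By way of non-limiting illustration, the following sketch shows the assistive-mode propose/approve/command/status exchange described in [0164]; the message fields and the link object are hypothetical, as the actual console protocols are vendor-specific.

```python
from dataclasses import dataclass

@dataclass
class PhacoCommand:
    ultrasound_power: float   # percent
    irrigation_on: bool
    aspiration_rate: float    # cc/min

@dataclass
class PhacoStatus:
    ok: bool
    actual_power: float

class FakeLink:
    """Stand-in for the dedicated two-way communication cable."""
    def send(self, command):
        self.last = command

    def receive_status(self):
        return PhacoStatus(ok=True, actual_power=self.last.ultrasound_power)

def propose_and_send(link, proposed, surgeon_review):
    """Assistive-mode flow: the system proposes settings, the surgeon may
    modify and/or override them, then the command goes out and status returns."""
    command = surgeon_review(proposed)   # surgeon approval or modification
    link.send(command)
    return link.receive_status()

status = propose_and_send(
    FakeLink(),
    PhacoCommand(ultrasound_power=20.0, irrigation_on=True, aspiration_rate=30.0),
    surgeon_review=lambda c: c,          # surgeon accepts the proposal as-is
)
print(status)
```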
[0189] FIG.9 illustrates a flow chart showing an exemplary method 900 of real-time 3D automated monitoring of an operative eye.
[0190] Method 900 begins at block 905 with acquiring data of the operative eye from a multimodal visualization subsystem including at least one optical coherence tomography (OCT) device input and one digital microscope input.
[0191] Method 900 continues at block 910 with extracting, via an artificial intelligence (AI) subsystem, patient-specific parameters from the acquired data, in which the extracting includes: overlaying digital anatomical landmarks on the acquired data of the operative eye, and determining a surgical incision site on the operative eye.
[0192] Method 900 continues at block 915 with determining, via a trajectory planner, a trajectory path through the surgical incision site on the operative eye for a selected one of a plurality of surgical tools.
[0193] Method 900 concludes at block 920 with modifying the trajectory path in real-time via the trajectory planner based on re-acquiring the data of the operative eye from the multimodal visualization system and re-extracting the patient-specific parameters from the acquired data.
[0194] According to certain embodiments of method 900, the OCT device scans a depth of up to ~18mm on the operative eye.
[0195] According to certain embodiments of method 900, the visualization field of volume for the OCT device extends from a first depth to a second depth.
[0196] According to certain embodiments of method 900, the first depth is a top of a cornea of the operative eye, wherein the second depth is a posterior capsule of the operative eye.
[0197] According to certain embodiments of method 900, the trajectory planner reassesses the trajectory path via one or more of the digital microscope input and the OCT device input of the multimodal visualization subsystem.
[0198] According to certain embodiments of method 900, the trajectory planner reassesses the trajectory path at a speed of up to 20Hz.
[0199] According to certain embodiments of method 900, the multimodal visualization subsystem collects and supplies imaging data to the trajectory planner.
[0200] According to certain embodiments of method 900, the multimodal visualization subsystem collects and supplies imaging data to the trajectory planner at speeds up to 100Hz.
[0201] According to certain embodiments of method 900, the trajectory planner reassesses the trajectory path via one or more of the digital microscope input and the OCT device input of the multimodal visualization subsystem, and the multimodal visualization subsystem collects and supplies imaging data to the trajectory planner at the same operating frequency.
[0202] According to certain embodiments of method 900, the trajectory planner reassesses the trajectory path via one or more of the digital microscope input and the OCT device input of the multimodal visualization subsystem, and the multimodal visualization subsystem collects and supplies imaging data to the trajectory planner at a different operating frequency.
[0203] According to certain embodiments of method 900, patient-specific parameters include operative eye size, pupil shape, anatomical anomalies, and pathology.
[0204] According to certain embodiments of method 900, extracting the patient-specific parameters from the acquired data further includes imaging the selected one of the plurality of surgical tools.
[0205] According to certain embodiments of method 900, extracting the patient-specific parameters from the acquired data further includes determining a location of a tip of the selected one of the plurality of surgical tools in a visualization field of volume.
[0206] According to certain embodiments of method 900, the trajectory path incorporates further surgical procedures including phacoemulsification and polishing of the operative eye.
[0207] FIG.10 illustrates a flow chart showing an exemplary method 1000 for automated control of an ophthalmic robotic surgical system.
[0208] Method 1000 begins at block 1005 with receiving, at a behavior tree layer of the surgical system, inputs from a human-robot interaction GUI for executing actions along steps of a determined trajectory path for a surgical instrument operating on the operative eye.
[0209] Method 1000 continues at block 1010 with determining a current step in the determined trajectory path.
[0210] Method 1000 continues at block 1015 with determining a next correct step in the determined trajectory path.
[0211] Method 1000 continues at block 1020 with determining a location of a tip of the surgical instrument relative to landmarks in a visualization field of volume using an OCT device of the surgical system.
[0212] Method 1000 concludes at block 1025 with executing, via a communication layer, commands to a physical layer of the surgical system to achieve the next correct step in the determined trajectory pathway.
[0213] According to certain embodiments of method 1000, inputs from the human-robot interaction GUI include commands to pause, resume, cancel, start, or finish the current step in the determined trajectory pathway, in which the current step is in one or more of a paused, running, or idle state.
[0214] According to certain embodiments of method 1000, the physical layer includes a control PC engine actuating one or more of: a docking system, a motion stage, a probe of the OCT device, and hardware.
[0215] According to certain embodiments of method 1000, steps of the determined trajectory pathway emulate standard procedures used in manual cataract surgery.
[0216] According to certain embodiments of method 1000, the inputs include inputs from a user-surgeon when the surgical system is operating in a teleoperative mode.
[0217] According to certain embodiments of method 1000, the inputs include inputs from a planning PC engine when the surgical system is operating in an assistive mode.
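As a non-limiting illustration of block 1025 (executing commands to the physical layer via the communication layer), the following sketch uses a toy in-process publish/subscribe bus in place of the ROS-style middleware 708; the topic names and message contents are hypothetical.

```python
from collections import defaultdict

class Bus:
    """Toy publish/subscribe middleware standing in for communication layer 706."""
    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subs[topic].append(callback)

    def publish(self, topic, msg):
        for callback in self.subs[topic]:
            callback(msg)

bus = Bus()
# Physical-layer subsystem: executes the command and reports status back.
bus.subscribe("motion_stage/command",
              lambda msg: bus.publish("motion_stage/status",
                                      {"done": True, "step": msg["step"]}))
# Task layer: advance to the next correct step once the status returns.
bus.subscribe("motion_stage/status",
              lambda msg: print("advance behavior tree after", msg["step"]))
bus.publish("motion_stage/command", {"step": "incision_approach"})
```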
[0218] It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein and may be used to achieve the benefits described herein.
[0219] The process parameters and sequence of steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various example methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
[0220] The exemplary use of a phacoemulsification system in FIGs.1-5 and 7-10 is provided by way of illustration and not limitation. Additionally or optionally, the specific steps detailed in FIGs.1-5 and 7-10 may be modified, re-ordered, or removed as needed depending upon the specific surgical tool being used, or the specific procedure or surgical step being performed by the surgical robotic system. In certain implementations involving the use of a selected surgical instrument, anatomical landmarks and imaging outputs from the multi-modal imaging system may also be modified and utilized in alternative ways in furtherance of robotic system control modes suited to the selected surgical instrument. Still further, in particular implementations, it is to be appreciated that specific aspects of FIGs.5, 7 and 8 will be adapted and modified based on the specific surgical instrument selected, along with robotic system adaptation for any support or accessories used with the selected surgical instrument, to result in appropriate modifications to, for example, a specific implementation of Behavior Tree 704, communications and controls as in FIG.8, or variations to FIGs.2 and 4B.
[0221] Any of the methods (including user interfaces) described herein may be implemented as software, hardware, or firmware, and may be described as a non-transitory computer-readable storage medium storing a set of instructions capable of being executed by a processor (e.g., computer, tablet, smartphone, etc.), that when executed by the processor causes the processor to perform any of the steps, including but not limited to: displaying, communicating with the user, analyzing, modifying parameters (including timing, frequency, intensity, etc.), determining, alerting, or the like. For example, any of the methods described herein may be performed, at least in part, by an apparatus including one or more processors having a memory storing a non-transitory computer-readable storage medium storing a set of instructions for the process(es) of the method.
[0222] While various embodiments have been described and/or illustrated herein in the context of fully functional computing systems, one or more of these example embodiments may be distributed as a program product in a variety of forms, regardless of the particular type of computer-readable media used to actually carry out the distribution. The embodiments disclosed herein may also be implemented using software modules that perform certain tasks. These software modules may include script, batch, or other executable files that may be stored on a computer-readable storage medium or in a computing system. In some embodiments, these software modules may configure a computing system to perform one or more of the example embodiments disclosed herein.
[0223] As described herein, the computing devices and systems described and/or illustrated herein broadly represent any type or form of computing device or system capable of executing computer-readable instructions, such as those contained within the modules described herein. In their most basic configuration, these computing device(s) may each comprise at least one memory device and at least one physical processor.
[0224] The term “memory” or “memory device,” as used herein, generally represents any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer-readable instructions. In one example, a memory device may store, load, and/or maintain one or more of the modules described herein.
Examples of memory devices comprise, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, Hard Disk Drives (HDDs), Solid-State Drives (SSDs), optical disk drives, caches, variations or combinations of one or more of the same, or any other suitable storage memory.
[0225] In addition, the term “processor” or “physical processor,” as used herein, generally refers to any type or form of hardware-implemented processing unit capable of interpreting and/or executing computer-readable instructions. In one example, a physical processor may access and/or modify one or more modules stored in the above-described memory device. Examples of physical processors comprise, without limitation, microprocessors, microcontrollers, Central Processing Units (CPUs), Field-Programmable Gate Arrays (FPGAs) that implement softcore processors, Application-Specific Integrated Circuits (ASICs), portions of one or more of the same, variations or combinations of one or more of the same, or any other suitable physical processor.
[0226] Although illustrated as separate elements, the method steps described and/or illustrated herein may represent portions of a single application. In addition, in some embodiments one or more of these steps may represent or correspond to one or more software applications or programs that, when executed by a computing device, may cause the computing device to perform one or more tasks, such as the method step.
[0227] In addition, one or more of the devices described herein may transform data, physical devices, and/or representations of physical devices from one form to another. Additionally or alternatively, one or more of the modules recited herein may transform a processor, volatile memory, non-volatile memory, and/or any other portion of a physical computing device from one form of computing device to another form of computing device by executing on the computing device, storing data on the computing device, and/or otherwise interacting with the computing device.
[0228] The term “computer-readable medium,” as used herein, generally refers to any form of device, carrier, or medium capable of storing or carrying computer-readable instructions. Examples of computer-readable media comprise, without limitation, transmission-type media, such as carrier waves, and non-transitory-type media, such as magnetic-storage media (e.g., hard disk drives, tape drives, and floppy disks), optical-storage media (e.g., Compact Disks (CDs), Digital Video Disks (DVDs), and BLU-RAY disks), electronic-storage media (e.g., solid-state drives and flash media), and other distribution systems.
[0229] A person of ordinary skill in the art will recognize that any process or method disclosed herein can be modified in many ways. The process parameters and sequence of the steps described and/or illustrated herein are given by way of example only and can be varied as desired.
For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed.
[0230] The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or comprise additional steps in addition to those disclosed. Further, a step of any method as disclosed herein can be combined with any one or more steps of any other method as disclosed herein.
[0231] The processor as described herein can be configured to perform one or more steps of any method disclosed herein. Alternatively or in combination, the processor can be configured to combine one or more steps of one or more methods as disclosed herein.
[0232] When a feature or element is herein referred to as being "on" another feature or element, it can be directly on the other feature or element, or intervening features and/or elements may also be present. In contrast, when a feature or element is referred to as being "directly on" another feature or element, there are no intervening features or elements present. It will also be understood that, when a feature or element is referred to as being "connected", "attached" or "coupled" to another feature or element, it can be directly connected, attached or coupled to the other feature or element, or intervening features or elements may be present. In contrast, when a feature or element is referred to as being "directly connected", "directly attached" or "directly coupled" to another feature or element, there are no intervening features or elements present. Although described or shown with respect to one embodiment, the features and elements so described or shown can apply to other embodiments. It will also be appreciated by those of skill in the art that references to a structure or feature that is disposed "adjacent" another feature may have portions that overlap or underlie the adjacent feature.
[0233] Terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. For example, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items and may be abbreviated as "/".
[0234] Spatially relative terms, such as "under", "below", "lower", "over", "upper" and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures.
For example, if a device in the figures is inverted, elements described as "under" or "beneath" other elements or features would then be oriented "over" the other elements or features. Thus, the exemplary term "under" can encompass both an orientation of over and under. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. Similarly, the terms "upwardly", "downwardly", "vertical", "horizontal" and the like are used herein for the purpose of explanation only, unless specifically indicated otherwise.
[0235] Although the terms “first” and “second” may be used herein to describe various features/elements (including steps), these features/elements should not be limited by these terms, unless the context indicates otherwise. These terms may be used to distinguish one feature/element from another feature/element. Thus, a first feature/element discussed below could be termed a second feature/element, and similarly, a second feature/element discussed below could be termed a first feature/element without departing from the teachings of the present invention.
[0236] Throughout this specification and the claims which follow, unless the context requires otherwise, the word “comprise”, and variations such as “comprises” and “comprising”, means that various components can be co-jointly employed in the methods and articles (e.g., compositions and apparatuses including devices and methods). For example, the term “comprising” will be understood to imply the inclusion of any stated elements or steps but not the exclusion of any other elements or steps.
[0237] In general, any of the apparatuses and methods described herein should be understood to be inclusive, but all or a sub-set of the components and/or steps may alternatively be exclusive, and may be expressed as “consisting of” or alternatively “consisting essentially of” the various components, steps, sub-components or sub-steps.
[0238] As used herein in the specification and claims, including as used in the examples and unless otherwise expressly specified, all numbers may be read as if prefaced by the word "about" or “approximately,” even if the term does not expressly appear. The phrase “about” or “approximately” may be used when describing magnitude and/or position to indicate that the value and/or position described is within a reasonable expected range of values and/or positions. For example, a numeric value may have a value that is +/- 0.1% of the stated value (or range of values), +/- 1% of the stated value (or range of values), +/- 2% of the stated value (or range of values), +/- 5% of the stated value (or range of values), +/- 10% of the stated value (or range of values), etc. Any numerical values given herein should also be understood to include about or approximately that value, unless the context indicates otherwise. For example, if the value "10" is disclosed, then "about 10" is also disclosed. Any numerical range recited herein is intended to include all sub-ranges subsumed therein.
It is also understood that when a value is disclosed, "less than or equal to" the value, "greater than or equal to" the value, and possible ranges between values are also disclosed, as appropriately understood by the skilled artisan. For example, if the value "X" is disclosed, then "less than or equal to X" as well as "greater than or equal to X" (e.g., where X is a numerical value) is also disclosed. It is also understood that, throughout the application, data is provided in a number of different formats, and that this data represents endpoints and starting points, and ranges for any combination of the data points. For example, if a particular data point “10” and a particular data point “15” are disclosed, it is understood that greater than, greater than or equal to, less than, less than or equal to, and equal to 10 and 15 are considered disclosed, as well as between 10 and 15. It is also understood that each unit between two particular units is also disclosed. For example, if 10 and 15 are disclosed, then 11, 12, 13, and 14 are also disclosed.
[0239] Although various illustrative embodiments are described above, any of a number of changes may be made to various embodiments without departing from the scope of the invention as described by the claims. For example, the order in which various described method steps are performed may often be changed in alternative embodiments, and in other alternative embodiments one or more method steps may be skipped altogether. Optional features of various device and system embodiments may be included in some embodiments and not in others. Therefore, the foregoing description is provided primarily for exemplary purposes and should not be interpreted to limit the scope of the invention as it is set forth in the claims.
[0240] The examples and illustrations included herein show, by way of illustration and not of limitation, specific embodiments in which the subject matter may be practiced. As mentioned, other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Such embodiments of the inventive subject matter may be referred to herein individually or collectively by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept, if more than one is, in fact, disclosed. Thus, although specific embodiments have been illustrated and described herein, any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.