WO2025072724A1 - Wide-angle field-of-view fluorescent scene imager - Google Patents
Wide-angle field-of-view fluorescent scene imager
- Publication number
- WO2025072724A1 (PCT/US2024/048930)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- optical filter
- radial
- wavelength
- blocking
- radial zone
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00186—Optical arrangements with imaging filters
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/20—Filters
- G02B5/208—Filters for use with infrared or ultraviolet radiation, e.g. for separating visible light from infrared and/or ultraviolet radiation
Definitions
- a process of fluorescence imaging can use a source of light of an excitation wavelength to illuminate a subject and stimulate an emission of light of another wavelength to make an image of features in a scene that are not easily visible under normal illumination.
- the subject can be an unaltered specimen, or one that has been dyed with special fluorescent dye(s) to highlight specific features of the specimen.
- fluorescence-based imaging provides surgeons with visualization of anatomy and tissue activity not visible through normal visualization.
- One common form of fluorescence imaging used in surgery uses indocyanine green (ICG) dye, which is injected into a patient’s bloodstream to image anatomical features and conditions such as tissue perfusion and blood flow. Multiple other dyes and autofluorescence capabilities enable different visualization behaviors that aid surgeons in targeting the correct tissue to dissect or avoid.
- a fluorescence imaging camera can be used for fluorescence imaging.
- the fluorescence imaging camera selectively blocks the much stronger excitation wavelength (background) that reflects back from the subject and transmits the weaker emission wavelength (signal) from the subject to create a useful image with a high signal-to-background ratio.
- Conventional filters can perform a selective blocking function to separate the emission and excitation wavelengths.
- such filters do not ensure that the selective blocking function is maintained appropriately across a wide-angle field of view of the fluorescence imaging camera due to their limited incident angle ranges. That is, in a wide-angle field of view more light at the excitation wavelength outside the limited incident angle range can pass through the filters. This results in poor performance, such as a reduced signal-to-background ratio, an obscured image, and the like.
- the present disclosure provides a multiple zone spectral optical filter (also referred to as “optical filter”).
- the optical filter can include a first radial zone to receive a first plurality of light beams having a first incident angle range.
- the first zone can include one or more blocking wavelength bands and one or more transmission wavelength bands.
- the optical filter can further include a second radial zone to receive a second plurality of light beams having a second incident angle range greater than the first incident angle range.
- the second radial zone can maintain one or more excitation wavelength blocking characteristics within the one or more blocking wavelength bands and one or more emission wavelength transmittance characteristics within the one or more transmission wavelength bands across the first and second zones.
- the second radial zone can have the same excitation wavelength blocking characteristics and the emission wavelength transmittance characteristics as the first zone.
- a change in the excitation wavelength blocking characteristics and/or the emission wavelength transmittance characteristics between the first zone and the second zone can satisfy a change threshold, that is, a value or value range indicating that the difference between radial zones is less than a specified value or falls within a specified value range.
- the optical filter can include a third radial zone to receive a third plurality of light beams having a third incident angle range greater than the second incident angle range, wherein the third radial zone maintains the one or more excitation wavelength blocking characteristics and the one or more emission wavelength transmittance characteristics across the first and second radial zones.
- the optical filter can further include a fourth radial zone to receive a fourth plurality of light beams having a fourth incident angle range greater than the third incident angle range, wherein the fourth radial zone maintains the one or more excitation wavelength blocking characteristics and the one or more emission wavelength transmittance characteristics across the first, second, and third radial zones.
- a multiple zone spectral optical filter as taught herein can be employed as part of a camera assembly.
- the present disclosure provides a camera assembly for wide-angle field-of-view imaging.
- the camera assembly can include a lens assembly and an optical filter as taught herein.
- the optical filter can receive light having wide incident angles transmitted by the lens assembly.
- the camera assembly can further include one or more image sensors having a wide-angle field-of-view and configured to capture light in one or more selected wavelength bands transmitted from the optical filter.
- a multiple zone spectral optical filter as taught herein can be employed as part of a laparoscope or surgical robotic system.
- the present disclosure provides a robotic surgical system.
- the robotic surgical system can include a light source and a camera assembly having a multiple zone spectral optical filter as taught herein.
- the robotic surgical system can perform multispectral imaging, for example, using multispectral illumination with excitation light in multiple wavelength ranges (e.g., in the visible, near-infrared, and/or infrared spectra) and detection of emission light in multiple wavelength ranges (e.g., in the visible, near-infrared, and/or infrared spectra).
- the camera assembly can be a multispectral camera assembly that enables simultaneous imaging of non-visible light (e.g., near-infrared and/or infrared fluorescence) and visible light (e.g., visible light fluorescence) of an internal body space.
- FIG. 1 is a diagram illustrating an example surgical robotic system in accordance with some embodiments.
- FIG. 2B is an example perspective view of an example operator console of a surgical robotic system of the present disclosure in accordance with some embodiments.
- FIG. 3A is a diagram illustrating an example side view of a surgical robotic system performing a surgery within an internal cavity of a subject in accordance with some embodiments.
- FIG. 3B is a diagram illustrating an example top view of the surgical robotic system performing the surgery within the internal cavity of the subject of FIG. 3A in accordance with some embodiments.
- FIG. 4A is an example perspective view of a single robotic arm subsystem in accordance with some embodiments.
- FIG. 4B is an example perspective side view of a single robotic arm of the single robotic arm subsystem of FIG. 4A in accordance with some embodiments.
- FIG. 5 is an example perspective front view of a camera assembly and a robotic arm assembly in accordance with some embodiments.
- FIG. 6 illustrates transmission spectra of an optical filter with spatially uniform characteristics for incident angles of 0 and 30 degrees, respectively.
- FIG. 8B illustrates an optical filter having two discrete radial zones in accordance with some embodiments.
- FIG. 8C illustrates an optical filter having continuously-varying radial zones in accordance with some embodiments.
- FIG. 9 illustrates a fluorescence imaging system using the optical filter illustrated in FIG. 8A in accordance with some embodiments.
- Fluorescence can help visualize blood vessels, ureters, cancer, nerves, tissue perfusion, amongst other tissue types and anatomical features. All types of fluorescence like dye, autofluorescence, and other types of differential visualization may be paired with a fluorescence imaging system.
- the fluorescence imaging system can employ filters on a camera assembly that selectively block specific frequencies of emitted light.
- Such an imaging system can use wide-angle lenses that can image large areas by collecting light at large angles, for example, between about -60 degrees or more and about +60 degrees or more. The light rays travel through the wide-angle lenses at large angles and also impinge upon an image sensor at large angles.
- multilayer filters to perform a selective blocking function to separate emission wavelengths and excitation wavelengths.
- multilayer filters can be sensitive to incident angles. Often, they can block excitation wavelengths at a limited incident angle range, such as between about -20 degrees and about +20 degrees. Outside of the limited incident angle range, the multilayer filters can allow more light at the excitation wavelength to pass through, reducing a signal-to-background ratio, and obscuring a desired image.
- the usable incident angle range can be extended somewhat, but at the cost of greater multilayer complexity, more exotic materials, and higher processing costs, which may not be suitable for surgical devices (e.g., surgical endoscopes, or the like).
- conventional optical filters can have spatially uniform characteristics (e.g., coating, the number of layers, layer thickness, shape, dimensions, materials, and/or the like are spatially uniform across an entire optical filter) such that excitation wavelength blocking characteristics and emission wavelength transmittance characteristics vary for different incident angles. For example, as incident angles increase, the filter characteristics of conventional optical filters change with features moving to shorter or longer wavelengths. The blocking band for the excitation wavelength starts to transmit more, and the transmission band for the emission wavelength starts to transmit less. The signal-to-background ratio decreases, compromising the quality of the fluorescence image.
- the present disclosure provides an optical filter allowing fluorescence imaging over a wide field of view with compact and inexpensive optics needed for surgical endoscopy.
- the optical filter as taught herein can include multiple radial zones to receive light beams incident on the zones over broad incident angle ranges (e.g., between about -35 degrees or more and about +35 degrees or more, between about -60 degrees or more and about +60 degrees or more, or the like).
- examples are described with respect to FIGS. 8A and 8B.
- the characteristics for each radial zone can be different, but constant within the same radial zone.
- as incident angles increase, transmission and blocking bands move with the incident angles as described above, where a blocking band for the excitation wavelength starts to transmit more and a transmission band for the emission wavelength starts to transmit less. However, the characteristics of each radial zone can be adjusted so that these band movements are tolerable for the wavelength range that the radial zone is designed for, thereby maintaining excitation wavelength blocking characteristics and emission wavelength transmittance characteristics across multiple radial zones over a broad incident angle range (e.g., between about -35 degrees or more and about +35 degrees or more, between about -60 degrees or more and about +60 degrees or more, or the like).
- controller can refer to a hardware device that includes a memory and a processor and is specifically programmed to execute the processes described herein in accordance with some embodiments.
- the memory is configured to store the modules and the processor is specifically configured to execute said modules to perform one or more processes which are described further below.
- multiple different controllers or multiple different types of controllers can be employed in performing one or more processes.
- different controllers can be implemented in different portions of a surgical robotic system.
- the surgical robotic module can include multiple different submodules or parts that can be inserted into the trocar separately.
- the surgical robotic module or robotic assembly can include multiple separate robotic arms that are deployable within the patient along different or separate axes. These multiple separate robotic arms can be collectively referred to as a robotic arm assembly herein.
- a surgical camera assembly can also be deployed along a separate axis.
- the surgical robotic module or robotic assembly can also include the surgical camera assembly.
- the surgical robotic module or robotic assembly employs multiple different components, such as a pair of robotic arms and a surgical or robotic camera assembly, each of which is deployable along a different axis and is separately manipulatable, maneuverable, and movable.
- the robot support system can be directly mounted to a surgical table or to the floor or ceiling within an operating room. In another embodiment, the mounting is achieved by various fastening means, including but not limited to, clamps, screws, or a combination thereof. In other embodiments, the structure can be free standing.
- the robot support system can mount a motor assembly that is coupled to the surgical robotic module, which includes the robotic arm assembly and the camera assembly.
- the motor assembly can include gears, motors, drivetrains, electronics, and the like, for powering the components of the surgical robotic module.
- the robotic arm assembly and the camera assembly are capable of multiple degrees of freedom of movement.
- the sensors of the sensing and tracking module 16 can be coupled to a region of the hand and/or the arm, such as the fingers, the wrist region, the elbow region, and/or the shoulder region. Additional sensors can also be coupled to a head and/or neck region of the operator in some embodiments.
- the sensing and tracking module 16 can be external and coupled to the hand controllers 17 via electrical components and/or mounting hardware.
- the optional sensor and tracking module 16A can sense and track movement of one or more of an operator’s head, at least a portion of an operator’s head, an operator’s eyes, or an operator’s neck based, at least in part, on imaging of the operator in addition to or instead of by a sensor or sensors attached to the operator’s body.
- the trocar 50 is a medical device that can be made up of an awl (which can be a metal or plastic sharpened or non-bladed tip), a cannula (essentially a hollow tube), and a seal in some embodiments.
- the trocar 50 can be used to place at least a portion of the robotic subsystem 20 in an interior cavity of a subject (e.g., a patient) and can withdraw gas and/or fluid from a body cavity.
- the robotic subsystem 20 can be inserted through the trocar 50 to access and perform an operation in vivo in a body cavity of a patient.
- the robotic subsystem 20 can be supported, at least in part, by the trocar 50 or a trocar mount with multiple degrees of freedom such that the robotic arm assembly 42 and the camera assembly 44 can be maneuvered within the patient into a single position or multiple different positions.
- the robotic arm assembly 42 and camera assembly 44 can be moved with respect to the trocar 50 or a trocar mount with multiple different degrees of freedom such that the robotic arm assembly 42 and the camera assembly 44 can be maneuvered within the patient into a single position or multiple different positions.
- the RSS 46 can further include an optional controller for processing input data from one or more of the system components (e.g., the display 12, the sensing and tracking module 16, the robotic arm assembly 42, the camera assembly 44, and the like), and for generating control signals in response thereto.
- the motor 40 can also include a storage element for storing data in some embodiments.
- the robotic arm assembly 42 can also have associated therewith end regions that can terminate in end-effectors that follow the movement of one or more fingers of the operator in some embodiments, such as for example the index finger as the user pinches together the index finger and thumb.
- the robotic arm assembly 42 can follow movement of the arms of the operator in some modes of control while a virtual chest of the robotic assembly can remain stationary (e.g., in an instrument control mode).
- the position and orientation of the torso of the operator are subtracted from the position and orientation of the operator’s arms and/or hands. This subtraction allows the operator to move his or her torso without the robotic arms moving. Further disclosure regarding control of movement of individual arms of a robotic assembly is provided in International Patent Application Publications WO 2022/094000 A1 and WO 2021/231402 A1, each of which is incorporated by reference herein in its entirety.
- the camera assembly 44 is configured to provide the operator with image data 48, such as for example a live video feed of an operation or surgical site, as well as enable the operator to actuate and control the cameras forming part of the camera assembly 44.
- the camera assembly 44 can include one or more cameras (e.g., a pair of cameras), the optical axes of which are axially spaced apart by a selected distance, known as the inter-camera distance, to provide a stereoscopic view or image of the surgical site.
- the operator can control the movement of the cameras via movement of the hands via sensors coupled to the hands of the operator or via hand controllers 17 grasped or held by hands of the operator, thus enabling the operator to obtain a desired view of an operation site in an intuitive and natural manner.
- the operator can additionally control the movement of the camera via movement of the operator’s head.
- the camera assembly 44 is movable in multiple directions, including for example in yaw, pitch and roll directions relative to a direction of view.
- the components of the stereoscopic cameras can be configured to provide a user experience that feels natural and comfortable.
- the interaxial distance between the cameras can be modified to adjust the depth of the operation site perceived by the operator.
- the image or video data 48 generated by the camera assembly 44 can be displayed on the display 12.
- the display 12 includes an HMD
- the display can include the built-in sensing and tracking module 16A that obtains raw orientation data for the yaw, pitch and roll directions of the HMD as well as positional data in Cartesian space (x, y, z) of the HMD.
- positional and orientation data regarding an operator’s head can be provided via a separate head-tracking module.
- the sensing and tracking module 16A can be used to provide supplementary position and orientation tracking data of the display in lieu of or in addition to the built-in tracking system of the HMD.
- no head tracking of the operator is used or employed.
- images of the operator can be used by the sensing and tracking module 16A for tracking at least a portion of the operator’s head.
- FIG. 2A depicts an example robotic assembly 20, which is also referred to herein as a robotic subsystem, of a surgical robotic system 10 incorporated into or mounted onto a mobile patient cart in accordance with some embodiments.
- the robotic subsystem 20 includes the RSS 46, which, in turn, includes the motor 40, the robotic arm assembly 42 having end-effectors 45, the camera assembly 44 having one or more cameras 47, and can also include the trocar 50 or a trocar mount.
- FIG. 2B depicts an example of an operator console 11 of the surgical robotic system 10 of the present disclosure in accordance with some embodiments.
- the operator console 11 includes a display 12, hand controllers 17, and also includes one or more additional controllers, such as a foot pedal array 19 for control of the robotic arm assembly 42, for control of the camera assembly 44, and for control of other aspects of the system.
- FIG. 2B also depicts the left hand controller subsystem 23A and the right hand controller subsystem 23B of the operator console.
- the left hand controller subsystem 23A includes and supports the left hand controller 17A
- the right hand controller subsystem 23B includes and supports the right hand controller 17B.
- the left hand controller subsystem 23A can releasably connect to or engage the left hand controller 17A
- right hand controller subsystem 23B can releasably connect to or engage the right hand controller 17B
- the connections can be both physical and electronic so that the left hand controller subsystem 23A and the right hand controller subsystem 23B can receive signals from the left hand controller 17A and the right hand controller 17B, respectively, including signals that convey inputs received from a user selection on a button or touch input device of the left hand controller 17A or the right hand controller 17B.
- Each of the left hand controller subsystem 23A and the right hand controller subsystem 23B can include components that enable a range of motion of the respective left hand controller 17A and right hand controller 17B, so that the left hand controller 17A and right hand controller 17B can be translated or displaced in three dimensions and can additionally move in the roll, pitch, and yaw directions. Additionally, each of the left hand controller subsystem 23A and the right hand controller subsystem 23B can register movement of the respective left hand controller 17A and right hand controller 17B in each of the foregoing directions and can send a signal providing such movement information to the processor 22 (as illustrated in FIG. 1) of the surgical robotic system 10.
- each of the left hand controller subsystem 23A and the right hand controller subsystem 23B can be configured to receive and connect to or engage different hand controllers (not illustrated).
- hand controllers with different configurations of buttons and touch input devices can be provided.
- hand controllers with a different shape can be provided. The hand controllers can be selected for compatibility with a particular surgical robotic system or a particular surgical robotic procedure or selected based upon preference of an operator with respect to the buttons and input devices or with respect to the shape of the hand controller in order to provide greater comfort and ease for the operator.
- FIG. 3A schematically depicts a side view of the surgical robotic system 10 performing a surgery within an internal cavity 104 of a subject 100 in accordance with some embodiments and for some surgical procedures.
- FIG. 3B schematically depicts a top view of the surgical robotic system 10 performing the surgery within the internal cavity 104 of the subject 100.
- the subject 100 (e.g., a patient)
- an operation table 102 (e.g., a surgical table 102)
- an incision is made in the patient 100 to gain access to the internal cavity 104.
- the trocar 50 is then inserted into the patient 100 at a selected location to provide access to the internal cavity 104 or operation site.
- the RSS 46 can then be maneuvered into position over the patient 100 and the trocar 50.
- the RSS 46 includes a trocar mount that attaches to the trocar 50.
- the camera assembly 44 and the robotic arm assembly 42 can be coupled to the motor 40 and inserted individually and/or sequentially into the patient 100 through the trocar 50 and hence into the internal cavity 104 of the patient 100.
- references to insertion of the robotic arm assembly 42 and/or the camera assembly 44 into an internal cavity of a subject and disposing the robotic arm assembly 42 and/or the camera assembly 44 in the internal cavity of the subject are referring to the portions of the robotic arm assembly 42 and the camera assembly 44 that are intended to be in the internal cavity of the subject during use.
- the sequential insertion method has the advantage of supporting smaller trocars and thus smaller incisions can be made in the patient 100, thus reducing the trauma experienced by the patient 100.
- the camera assembly 44 and the robotic arm assembly 42 can be inserted in any order or in a specific order.
- the camera assembly 44 can be followed by a first robotic arm 42A of the robotic arm assembly 42 and then followed by a second robotic arm 42B of the robotic arm assembly 42 all of which can be inserted into the trocar 50 and hence into the internal cavity 104.
- the RSS 46 can move the robotic arm assembly 42 and the camera assembly 44 to an operation site manually or automatically controlled by the operator console 11.
- FIG. 4A is a perspective view of a robotic arm subassembly 21 in accordance with some embodiments.
- the robotic arm subassembly 21 includes a robotic arm 42A, the end-effector 45 having an instrument tip 120 (e.g., monopolar scissors, needle driver/holder, bipolar grasper, or any other appropriate tool), and a shaft 122 supporting the robotic arm 42A.
- a distal end of the shaft 122 is coupled to the robotic arm 42A, and a proximal end of the shaft 122 is coupled to a housing 124 of the motor 40 (as illustrated in FIG. 2A). At least a portion of the shaft 122 can be external to the internal cavity 104 (as illustrated in FIGS. 3A and 3B). At least a portion of the shaft 122 can be inserted into the internal cavity 104 (as illustrated in FIGS. 3A and 3B).
- FIG. 4B is a side view of the robotic arm assembly 42.
- the robotic arm assembly 42 includes a shoulder joint 126 forming a virtual shoulder, an elbow joint 128 having position sensors 132 (e.g., capacitive proximity sensors) and forming a virtual elbow, a wrist joint 130 forming a virtual wrist, and the end-effector 45 in accordance with some embodiments.
- the shoulder joint 126, the elbow joint 128, and the wrist joint 130 can include a series of hinge and rotary joints to provide each arm with seven positionable degrees of freedom, along with one additional grasping degree of freedom for the end-effector 45 in some embodiments.
- the surgical robotic system 10 as a whole has nine degrees of freedom.
- FIG. 5 illustrates a perspective front view of a portion of the robotic assembly 20 configured for insertion into an internal body cavity of a patient.
- the robotic assembly 20 includes a robotic arm 42 A and a robotic arm 42B.
- the two robotic arms 42 A and 42B can define, or at least partially define, a virtual chest 140 of the robotic assembly 20 in some embodiments.
- the virtual chest 140 (depicted as a triangle with dotted lines) can be defined by a chest plane extending between a first pivot point 142A of a most proximal joint of the robotic arm 42A (e.g., a shoulder joint 126), a second pivot point 142B of a most proximal joint of the robotic arm 42B, and a camera imaging center point 144 of the camera(s) 47.
- a pivot center 146 of the virtual chest 140 lies in the middle of the virtual chest 140.
- sensors in one or both of the robotic arm 42A and the robotic arm 42B can be used by the surgical robotic system 10 to determine a change in location in three-dimensional space of at least a portion of each or both of the robotic arms 42A and 42B.
- sensors in one or both of the first robotic arm 42A and second robotic arm 42B can be used by the surgical robotic system 10 to determine a location in three-dimensional space of at least a portion of one robotic arm relative to a location in three-dimensional space of at least a portion of the other robotic arm.
- the camera assembly 44 is configured to obtain images from which the surgical robotic system 10 can determine relative locations in three-dimensional space.
- the camera assembly 44 can include multiple cameras, at least two of which are laterally displaced from each other relative to an imaging axis, and the system can be configured to determine a distance to features within the internal body cavity.
- a surgical robotic system including a camera assembly and an associated system for determining a distance to features can be found in International Patent Application Publication No. WO 2021/159409, entitled “System and Method for Determining Depth Perception In Vivo in a Surgical Robotic System,” and published August 12, 2021, which is incorporated by reference herein in its entirety.
- Information about the distance to features and information regarding optical properties of the cameras can be used by a system to determine relative locations in three-dimensional space.
- FIG. 6 illustrates transmission spectra 610 and 620 of a conventional optical filter with spatially uniform characteristics for incident angles of 0 and 30 degrees, respectively.
- Line 630 depicts ICG excitation and line 640 depicts ICG emission.
- Line 650 depicts fluorescein excitation and line 660 depicts fluorescein emission.
- the blocking band (e.g., band having about 0% transmittance percentage)
- the transmission band (e.g., band having about 60% or more transmittance percentage)
- a signal to background ratio decreases, compromising the quality of the fluorescence image.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Physics & Mathematics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Optics & Photonics (AREA)
- Biophysics (AREA)
- Radiology & Medical Imaging (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Toxicology (AREA)
- General Physics & Mathematics (AREA)
- Robotics (AREA)
- Endoscopes (AREA)
Abstract
An optical filter (800A) for a wide-angle field-of-view fluorescent scene imager is provided. For example, an optical filter (800A) can include a first radial zone (810A) to receive a first plurality of light beams having a first incident angle range. The first radial zone (810A) can include one or more blocking wavelength bands and one or more transmission wavelength bands. The optical filter (800A) can further include a second radial zone (810B) configured to receive a second plurality of light beams having a second incident angle range greater than the first incident angle range. The second radial zone (810B) can maintain one or more excitation wavelength blocking characteristics within the one or more blocking wavelength bands and one or more emission wavelength transmittance characteristics within the one or more transmission wavelength bands across the first and second radial zones.
Description
WIDE-ANGLE FIELD-OF-VIEW FLUORESCENT SCENE IMAGER
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of priority to U.S. Provisional Patent Application No. 63/541,660 filed on September 29, 2023, the entire contents of which are incorporated herein by reference.
BACKGROUND
[0002] A process of fluorescence imaging can use a source of light of an excitation wavelength to illuminate a subject and stimulate an emission of light of another wavelength to make an image of features in a scene that are not easily visible under normal illumination. The subject can be an unaltered specimen, or one that has been dyed with special fluorescent dye(s) to highlight specific features of the specimen. For example, fluorescence-based imaging provides surgeons with visualization of anatomy and tissue activity not visible through normal visualization. One common form of fluorescence imaging used in surgery uses indocyanine green (ICG) dye, which is injected into a patient’s bloodstream to image anatomical features and conditions such as tissue perfusion and blood flow. Multiple other dyes and autofluorescence capabilities enable different visualization behaviors that aid surgeons in targeting the correct tissue to dissect or avoid.
[0003] A fluorescence imaging camera can be used for fluorescence imaging. The fluorescence imaging camera selectively blocks the much stronger excitation wavelength (background) that reflects back from the subject and transmits the weaker emission wavelength (signal) from the subject to create a useful image with a high signal-to-background ratio. Conventional filters can perform a selective blocking function to separate the emission and excitation wavelengths. However, such filters do not ensure that the selective blocking function is maintained appropriately across a wide-angle field of view of the fluorescence imaging camera due to their limited incident angle ranges. That is, in a wide-angle field of view more light at the excitation wavelength outside the limited incident angle range can pass through the filters. This results in poor performance, such as a reduced signal-to-background ratio, an obscured image, and the like.
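As a rough illustration of why the blocking matters (the numbers below are chosen purely for this example and are not taken from the disclosure), suppose the reflected excitation light reaching the camera is 10^4 times stronger than the fluorescence emission, and the emission filter transmits 90% of the emission band while blocking the excitation band at an optical density of 6 (a transmission of 10^-6):

$$\mathrm{SBR} \approx \frac{0.9\,S}{10^{-6} \times 10^{4}\,S} = 90$$

If the blocking degrades to an optical density of 3 at large incident angles, the same arithmetic gives a signal-to-background ratio of only about 0.09, and the background overwhelms the signal.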
SUMMARY
[0004] The present disclosure provides a multiple zone spectral optical filter (also referred to as “optical filter”). The optical filter can include a first radial zone to receive a first plurality of light beams having a first incident angle range. The first zone can include one or more blocking wavelength bands and one or more transmission wavelength bands. The optical filter can further include a second radial zone to receive a second plurality of light beams having a second incident angle range greater than the first incident angle range. The second radial zone can maintain one or more excitation wavelength blocking characteristics within the one or more blocking wavelength bands and one or more emission wavelength transmittance characteristics within the one or more transmission wavelength bands across the first and second zones. For example, the second radial zone can have the same excitation wavelength blocking characteristics and the emission wavelength transmittance characteristics as the first zone. As another example, a change in the excitation wavelength blocking characteristics and/or the emission wavelength transmittance characteristics between the first zone and the second zone can satisfy a change threshold, that is, a value or value range indicating that the difference between radial zones is less than a specified value or falls within a specified value range.
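A minimal sketch of how the per-zone characteristics and the change threshold described above could be represented and checked, assuming a simple two-number summary per zone; the class, field, and threshold names are illustrative and are not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class RadialZone:
    """One radial zone of a multi-zone filter (all values illustrative)."""
    incident_angle_range: tuple[float, float]  # degrees, e.g. (0.0, 15.0)
    excitation_blocking_od: float              # optical density within the blocking band(s)
    emission_transmittance: float              # fractional transmission within the transmission band(s)

def characteristics_maintained(zone_a: RadialZone, zone_b: RadialZone,
                               od_change_threshold: float = 0.5,
                               transmittance_change_threshold: float = 0.05) -> bool:
    """True if blocking and transmittance characteristics differ between two zones
    by less than the stated change thresholds."""
    od_ok = abs(zone_a.excitation_blocking_od - zone_b.excitation_blocking_od) < od_change_threshold
    t_ok = abs(zone_a.emission_transmittance - zone_b.emission_transmittance) < transmittance_change_threshold
    return od_ok and t_ok

# Example: a central zone and an outer zone tuned for steeper rays.
inner = RadialZone((0.0, 15.0), excitation_blocking_od=5.2, emission_transmittance=0.88)
outer = RadialZone((15.0, 35.0), excitation_blocking_od=5.0, emission_transmittance=0.86)
print(characteristics_maintained(inner, outer))  # True
```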
[0005] In some embodiments, the optical filter can include a third radial zone to receive a third plurality of light beams having a third incident angle range greater than the second incident angle range, wherein the third radial zone maintains the one or more excitation wavelength blocking characteristics and the one or more emission wavelength transmittance characteristics across the first and second radial zones.
[0006] In some embodiments, the optical filter can further include a fourth radial zone to receive a fourth plurality of light beams having a fourth incident angle range greater than the third incident angle range, wherein the fourth radial zone maintains the one or more excitation wavelength blocking characteristics and the one or more emission wavelength transmittance characteristics across the first, second, and third radial zones.
[0007] In some embodiments, a multiple zone spectral optical filter as taught herein can be employed as part of a camera assembly. The present disclosure provides a camera assembly for wide-angle field-of-view imaging. The camera assembly can include a lens assembly and an optical filter as taught herein. The optical filter can receive light having wide incident angles transmitted by the lens assembly. The camera assembly can further include one or
more image sensors having a wide-angle field-of-view and configured to capture light in one or more selected wavelength bands transmitted from the optical filter.
[0008] In some embodiments, a multiple zone spectral optical filter as taught herein can be employed as part of a laparoscope or surgical robotic system. In some embodiments, the present disclosure provides a robotic surgical system. The robotic surgical system can include a light source and a camera assembly having a multiple zone spectral optical filter as taught herein. In some embodiments, the robotic surgical system can perform multispectral imaging. For example, multispectral illumination with excitation light in multiple wavelength ranges (e.g., in the visible, near-infrared, and/or infrared spectra) and detection of emission light in multiple wavelength ranges (e.g., in the visible, near-infrared, and/or infrared spectra) may produce a combined multispectral image output for display. The camera assembly can be a multispectral camera assembly that enables simultaneous imaging of non-visible light (e.g., near-infrared and/or infrared fluorescence) and visible light (e.g., visible light fluorescence) of an internal body space.
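The disclosure does not specify how the combined multispectral output is composed; the following is a minimal sketch of one common approach (a pseudocolor fluorescence overlay on the visible image), with all function names, parameters, and array shapes invented for this example:

```python
import numpy as np

def overlay_fluorescence(visible_rgb: np.ndarray, nir_fluorescence: np.ndarray,
                         tint=(0.0, 1.0, 0.6), gain: float = 1.0) -> np.ndarray:
    """Blend a normalized near-infrared fluorescence frame onto a visible RGB frame
    as a pseudocolor overlay. Inputs are float arrays in [0, 1], shapes (H, W, 3) and (H, W)."""
    alpha = np.clip(gain * nir_fluorescence, 0.0, 1.0)[..., np.newaxis]  # per-pixel blend weight
    overlay = np.asarray(tint, dtype=np.float32) * alpha                 # tinted fluorescence layer
    return np.clip(visible_rgb * (1.0 - alpha) + overlay, 0.0, 1.0)

# Illustrative use with synthetic frames.
visible = np.random.rand(480, 640, 3).astype(np.float32)
fluorescence = np.zeros((480, 640), dtype=np.float32)
fluorescence[200:280, 300:380] = 0.9   # a bright, well-perfused region
combined = overlay_fluorescence(visible, fluorescence)
```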
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] These and other features and advantages of the present invention will be more fully understood by reference to the following detailed description in conjunction with the attached drawings in which like reference numerals refer to like elements throughout the different views. The drawings illustrate principles of the invention and, although not to scale, show relative dimensions.
[0010] FIG. 1 is a diagram illustrating an example surgical robotic system in accordance with some embodiments.
[0011] FIG. 2A is an example perspective view of a patient cart including a robotic support system coupled to a robotic subsystem of the surgical robotic system in accordance with some embodiments.
[0012] FIG. 2B is an example perspective view of an example operator console of a surgical robotic system of the present disclosure in accordance with some embodiments.
[0013] FIG. 3A is a diagram illustrating an example side view of a surgical robotic system performing a surgery within an internal cavity of a subject in accordance with some embodiments.
[0014] FIG. 3B is a diagram illustrating an example top view of the surgical robotic system performing the surgery within the internal cavity of the subject of FIG. 3A in accordance with some embodiments.
[0015] FIG. 4A is an example perspective view of a single robotic arm subsystem in accordance with some embodiments.
[0016] FIG. 4B is an example perspective side view of a single robotic arm of the single robotic arm subsystem of FIG. 4A in accordance with some embodiments.
[0017] FIG. 5 is an example perspective front view of a camera assembly and a robotic arm assembly in accordance with some embodiments.
[0018] FIG. 6 illustrates transmission spectra of an optical filter with spatially uniform characteristics for incident angles of 0 and 30 degrees, respectively.
[0019] FIG. 7 illustrates a wide-angle field-of-view fluorescence imager having an angular distribution of light rays as they enter and exit a camera lens stack to reach an image sensor.
[0020] FIG. 8A illustrates an optical filter having four discrete radial zones in accordance with some embodiments.
[0021] FIG. 8B illustrates an optical filter having two discrete radial zones in accordance with some embodiments.
[0022] FIG. 8C illustrates an optical filter having continuously-varying radial zones in accordance with some embodiments.
[0023] FIG. 9 illustrates a fluorescence imaging system using the optical filter illustrated in FIG. 8A in accordance with some embodiments.
DETAILED DESCRIPTION
[0024] Fluorescence can help visualize blood vessels, ureters, cancer, nerves, and tissue perfusion, amongst other tissue types and anatomical features. All types of fluorescence, such as dye, autofluorescence, and other types of differential visualization, may be paired with a fluorescence imaging system. The fluorescence imaging system can employ filters on a camera assembly that selectively block specific frequencies of emitted light. Such an imaging system can use wide-angle lenses that can image large areas by collecting light at large angles, for example, between about -60 degrees or more and about +60 degrees or more. The light rays travel through the wide-angle lenses at large angles and also impinge upon an image sensor at large angles. Conventional fluorescence imaging systems generally use multilayer filters to perform a selective blocking function to separate emission wavelengths and excitation wavelengths. However, such multilayer filters can be sensitive to incident angles. Often, they can block excitation wavelengths only over a limited incident angle range, such as between about -20 degrees and about +20 degrees. Outside of the limited incident angle range, the multilayer filters can allow more light at the excitation wavelength to pass through, reducing the signal-to-background ratio and obscuring a desired image. The usable incident angle range can be extended somewhat, but at the cost of greater multilayer complexity, more exotic materials, and higher processing costs, which may not be suitable for surgical devices (e.g., surgical endoscopes, or the like).
[0025] Further, conventional optical filters can have spatially uniform characteristics (e.g., coating, the number of layers, layer thickness, shape, dimensions, materials, and/or the like are spatially uniform across an entire optical filter) such that excitation wavelength blocking characteristics and emission wavelength transmittance characteristics vary for different incident angles. For example, as incident angles increase, the filter characteristics of conventional optical filters change with features moving to shorter or longer wavelengths. The blocking band for the excitation wavelength starts to transmit more, and the transmission band for the emission wavelength starts to transmit less. The signal-to-background ratio decreases, compromising the quality of the fluorescence image.
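The shift of interference-filter features with incident angle can be approximated by the standard first-order relation (offered here as general optics background rather than as part of the disclosure; the effective index value below is assumed purely for illustration):

$$\lambda(\theta) \approx \lambda_0 \sqrt{1 - \frac{\sin^2\theta}{n_{\mathrm{eff}}^2}}$$

where $\lambda_0$ is the feature wavelength at normal incidence and $n_{\mathrm{eff}}$ is the effective refractive index of the coating stack. For example, with $\lambda_0 = 820$ nm and an assumed $n_{\mathrm{eff}} \approx 1.8$, a ray arriving at $\theta = 30^\circ$ sees the feature shifted to about $820 \times \sqrt{1 - 0.25/3.24} \approx 788$ nm, roughly a 32 nm shift toward shorter wavelengths, which can be enough to move a blocking-band edge off the excitation line.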
[0026] The present disclosure provides an optical filter allowing fluorescence imaging over a wide field of view with compact and inexpensive optics needed for surgical endoscopy. The optical filter as taught herein can include multiple radial zones to receive light beams incident on the zones over broad incident angle ranges (e.g., between about -35 degrees or more and about +35 degrees or more, between about -60 degrees or more and about +60 degrees or more, or the like). In some embodiments, characteristics (e.g., coating, the number of layers, layer thickness, shape, dimensions, materials, and/or the like) can be, for an individual radial zone, constant but optimized for a specific angular range. Examples are described with respect to FIGS. 8A and 8B. The characteristics for each radial zone can be different, but constant within the same radial zone. As incident angles increase, transmission and blocking bands move with the incident angles as described above, where a blocking band for the excitation wavelength starts to transmit more and a transmission band for the emission wavelength starts to transmit less. However, the characteristics of each radial zone can be adjusted so that these band movements are tolerable for the wavelength range that the radial zone is designed for, thereby maintaining excitation wavelength blocking characteristics and emission wavelength transmittance characteristics across multiple radial zones over a broad incident angle range (e.g., between about -35 degrees or more and about +35 degrees or more, between about -60 degrees or more and about +60 degrees or more, or the like). In some embodiments, characteristics (e.g., coating, the number of layers, layer thickness, shape, dimensions, materials, and/or the like) of an optical filter as taught herein can be continuously radially varying and can be optimized for continuous incident angle ranges that vary along radial directions in order to maintain excitation wavelength blocking characteristics and emission wavelength transmittance characteristics across multiple radial zones over a broad incident angle range (e.g., between about -35 degrees or more and about +35 degrees or more, between about -60 degrees or more and about +60 degrees or more, or the like). Examples are described with respect to FIG. 8C. Accordingly, optical filters as taught herein can overcome drawbacks of optical filters having spatially uniform characteristics and other drawbacks of optical filters.
[0027] As taught herein, excitation wavelength blocking characteristics within blocking wavelength bands and emission wavelength transmittance characteristics within the transmission wavelength bands can be maintained across the multiple radial zones. For example, in response to different incident angles over a wide field of view, each radial zone can have excitation wavelength blocking characteristics satisfying a respective blocking characteristic threshold and emission wavelength transmittance characteristics satisfying a respective transmittance characteristic threshold, indicating that the excitation wavelength blocking characteristics (examples described below) and the emission wavelength transmittance characteristics (examples described below) can be constant across multiple radial zones over a broad incident angle range. For example, transmission wavelength bands for each radial zone can have transmission percentages satisfying a transmission percentage threshold (e.g., greater than about 80%) or falling within a transmittance value range (e.g., about 80% to 100%). As another example, transmission wavelength bands for each radial zone can have full width-half maximum (FWHM) values satisfying an FWHM threshold (e.g., greater than about 10 nanometers) or falling within an FWHM range (e.g., about 10 nanometers to 100 nanometers or more). Blocking wavelength bands for each radial zone can have an optical density value satisfying an optical density threshold (e.g., greater than an optical density of 4) or falling within a threshold range (e.g., about OD4 to OD6 or more).
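To make the idea of discrete radial zones concrete, the sketch below maps a ray's radial position on the filter to the zone optimized for it; the zone radii and angle ranges are hypothetical values chosen for illustration (only the roughly 2.6 mm central-zone diameter echoes the example given below), and are not part of the disclosure:

```python
# Hypothetical four-zone layout: (outer radius in mm, design incident-angle range in degrees).
# Radii and angle ranges are illustrative only; they are not values from the disclosure,
# apart from the ~2.6 mm central-zone diameter mentioned as an example below.
ZONES = [
    (1.3, (0.0, 10.0)),   # central zone, ~2.6 mm diameter
    (2.0, (10.0, 20.0)),
    (2.6, (20.0, 28.0)),
    (3.1, (28.0, 35.0)),  # outermost zone handles the steepest rays
]

def zone_for_radius(r_mm: float) -> int:
    """Return the index of the discrete radial zone a ray strikes at radius r_mm from the
    filter center (assumes the zones tile the clear aperture without gaps or overlap)."""
    for index, (outer_radius, _angle_range) in enumerate(ZONES):
        if r_mm <= outer_radius:
            return index
    raise ValueError("radius lies outside the filter's clear aperture")

print(zone_for_radius(0.8))   # 0 -> central zone, designed for near-normal rays
print(zone_for_radius(2.4))   # 2 -> an outer zone, designed for steeper rays
```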
[0028] Examples of excitation wavelength blocking characteristics can include an optical density describing an amount of energy blocked or rejected by an optical filter (e.g., a high optical density value indicates low transmission, and low optical density indicates high transmission), a blocking range describing a wavelength interval used to denote a spectral region of energy that is attenuated by an optical filter, a cut-off wavelength denoting a
wavelength at which the transmission decreases to about 50% throughput, or the like. Examples of emission wavelength transmittance characteristics can include a transmission percentage describing an amount of energy transmitted by an optical filter, a transmission range describing a wavelength interval used to denote a spectral region of energy that is transmitted by an optical filter, a cut-on wavelength denoting a wavelength at which the transmission increases to about 50% throughput, a central wavelength describing a midpoint of spectral bandwidth over which an optical filter transmits, a bandwidth describing a wavelength range used to denote a specific part of a spectrum that passes incident energy through an optical filter, a full width-half maximum (FWHM) describing a spectral bandwidth over which an optical filter will transmit, a slope describing a bandwidth over which the filter transitions from high blocking to high transmission (e.g., from about a 10% transmission point to about an 80% transmission point), or the like.
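The optical density figures used in these examples relate to transmission through the standard definition (general background, with the OD 4 to OD 6 range taken from the examples herein):

$$\mathrm{OD} = -\log_{10} T \qquad\Longleftrightarrow\qquad T = 10^{-\mathrm{OD}}$$

so a blocking band specified at OD 4 transmits at most $10^{-4}$ (0.01%) of the incident excitation light, and OD 6 transmits at most $10^{-6}$ (0.0001%).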
[0029] In some embodiments, the multiple radial zones of the optical filter as taught herein are discrete zones or continuously varying zones. In some embodiments, the optical filter as taught herein can have an optical density of 4 or more for the blocking wavelength bands. In some embodiments, the optical filter as taught herein can have about 80% or more transmittance for the transmission wavelength bands. In some embodiments, a thickness of each radial zone can be less than about 0.3 millimeters (mm) (e.g., about 0.21 mm, or in a range of about 0.21 mm inclusive to about 0.3 mm inclusive). In some embodiments, the optical filter as taught herein can have a central radial zone located at the center of the optical filter and having a diameter of about 2.6 mm. In some embodiments, the optical filter as taught herein can have dimensions of about (6.2 mm ± 0.1 mm) x about (5.7 mm ± 0.1 mm). In some embodiments, substrate material for the optical filter as taught herein can be fused silica or equivalent. In some embodiments, at an interface between discrete radial zones, where one radial zone ends, the next immediately begins. In some embodiments, the radial zones may overlap by a small margin in order to ensure that the light filtering occurs over the entire lens. This results in a small overlap zone in which light is filtered according to the characteristics of both adjacent zones. In some embodiments, this overlap zone may be undesirable and thus a light-blocking coating, for example a chrome coating, can be applied to this overlap zone to limit light passing through it. In some embodiments, an annular space may be left between the radial zones, thereby preventing overlap of the radial zones. Furthermore, in some embodiments, the annular space between radial zones can be coated with light-blocking black chrome or other light-blocking materials known in the art. This light-blocking layer results in a thin dark ring in the resulting image produced by the lens. In some embodiments, operating
temperature for each zone can be about 15°C to about 70°C. In some embodiments, an environmental temperature for an optical filter as taught herein can be about -10°C to 140°C. In some embodiments, an optical filter as taught herein can survive sterilization process temperatures up to 140°C (e.g., in a sealed, air environment).
[0030] In some embodiments, the optical filter as taught herein can include one or more blocking wavelength bands. For example, each radial zone can include a first blocking wavelength band in a visible wavelength range (e.g., in a wavelength range of about 470 nanometers (nm) to about 520 nm) and a second blocking wavelength band in a near infrared wavelength range (e.g., in a wavelength range of about 720 nm to about 845 nm). In some embodiments, a multiple zone spectral optical filter as taught herein can include one or more transmission wavelength bands. For example, each radial zone can include a first transmission wavelength band in a wavelength range of about 430 nm to about 465 nm, a second transmission wavelength band in a wavelength range of about 525 nm to about 715 nm, and a third transmission wavelength band in a wavelength range of about 850 nm to about 880 nm. It should be understood that the blocking wavelength bands and the transmission wavelength bands can include any wavelength ranges, but are not limited to the above wavelength ranges. It should also be understood that the optical filter as taught herein can include any number of blocking wavelength bands and transmission wavelength bands, but is not limited to the above number of bands.
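A minimal sketch of the example band layout just described, with a helper to test where a given wavelength falls; the band limits come from the example ranges above, while the function names and the sample wavelengths are illustrative:

```python
# Example wavelength bands from the ranges above (nanometers).
BLOCKING_BANDS_NM = [(470, 520), (720, 845)]
TRANSMISSION_BANDS_NM = [(430, 465), (525, 715), (850, 880)]

def is_blocked(wavelength_nm: float) -> bool:
    """True if the wavelength falls within an excitation blocking band of this example design."""
    return any(lo <= wavelength_nm <= hi for lo, hi in BLOCKING_BANDS_NM)

def is_transmitted(wavelength_nm: float) -> bool:
    """True if the wavelength falls within a transmission band of this example design."""
    return any(lo <= wavelength_nm <= hi for lo, hi in TRANSMISSION_BANDS_NM)

print(is_blocked(805))      # True  -> near-infrared excitation light (e.g., ~805 nm) is rejected
print(is_transmitted(860))  # True  -> light in the near-infrared emission detection band passes
print(is_transmitted(500))  # False -> visible excitation light around 500 nm is not passed
```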
[0031] Prior to providing additional specific description of the optical filter as taught herein with respect to FIGS. 6-9, a surgical robotic system in which some embodiments could be employed is described below with respect to FIGS. 1-5. In some embodiments, the optical filter as taught herein may be employed without the surgical robotic system. In some embodiments, the optical filter as taught herein may be employed in any imager or camera.
[0032] While various embodiments have been taught and described herein, it will be clear to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions can occur to those skilled in the art without departing from the invention. It can be understood that various alternatives to the embodiments taught herein can be employed.
[0033] Unless specifically stated or obvious from context, as used herein, the term “about” is understood as within a range of normal tolerance in the art, for example within 2 standard deviations of the mean. “About” can be understood as within 10%, 9%, 8%, 7%, 6%, 5%, 4%, 3%, 2%, 1%, 0.5%, 0.1%, 0.05%, or 0.01% of the stated value. Unless otherwise clear from the context, all numerical values provided herein are modified by the term “about.”
[0034] Although some example embodiments can be described herein or in documents incorporated by reference as employing a plurality of units to perform example processes, it is understood that example processes can also be performed by one or a plurality of modules. Additionally, it is understood that the term controller can refer to a hardware device that includes a memory and a processor and is specifically programmed to execute the processes described herein in accordance with some embodiments. In some embodiments, the memory is configured to store the modules and the processor is specifically configured to execute said modules to perform one or more processes which are described further below. In some embodiments, multiple different controllers or multiple different types of controllers can be employed in performing one or more processes. In some embodiments, different controllers can be implemented in different portions of a surgical robotic system.
Surgical Robotic Systems
[0035] Some embodiments can be employed with a surgical robotic system. A system for robotic surgery can include a robotic subsystem. The robotic subsystem includes at least a portion, which can also be referred to as a robotic assembly herein, that can be inserted into a patient via a trocar through a single incision point or site. The portion inserted into the patient via a trocar is small enough to be deployed in vivo at the surgical site and is sufficiently maneuverable when inserted to be able to move within the body to perform various surgical procedures at multiple different points or sites. The portion inserted into the body that performs functional tasks can be referred to as a surgical robotic module or a robotic assembly herein. The surgical robotic module can include multiple different submodules or parts that can be inserted into the trocar separately. The surgical robotic module or robotic assembly can include multiple separate robotic arms that are deployable within the patient along different or separate axes. These multiple separate robotic arms can be collectively referred to as a robotic arm assembly herein. Further, a surgical camera assembly can also be deployed along a separate axis. The surgical robotic module or robotic assembly can also include the surgical camera assembly. Thus, the surgical robotic module or robotic assembly employs multiple different components, such as a pair of robotic arms and a surgical or robotic camera assembly, each of which is deployable along a different axis and is separately manipulatable, maneuverable, and movable. This arrangement of robotic arms and a camera assembly disposable along separate and manipulatable axes is referred to herein as the Split Arm
(SA) architecture. The SA architecture is designed to simplify and increase efficiency of the insertion of robotic surgical instruments through a single trocar at a single insertion site, while concomitantly assisting with deployment of the surgical instruments into a surgical ready state as well as the subsequent removal of the surgical instruments through the trocar. By way of example, a surgical instrument can be inserted through the trocar to access and perform an operation in vivo in the abdominal cavity of a patient. In some embodiments, various surgical instruments can be used or employed, including but not limited to robotic surgical instruments, as well as other surgical instruments known in the art.
[0036] The systems, devices, and methods taught herein can be incorporated into and/or used with a robotic surgical device and associated system taught, for example, in United States Patent No. 10,285,765 and in PCT patent application Serial No. PCT/US2020/39203, and/or with the camera assembly and system taught in United States Publication No. 2019/0076199, and/or the systems and methods of exchanging surgical tools in an implantable surgical robotic system taught in PCT patent application Serial No. PCT/US2021/058820, where the content and teachings of all of the foregoing patents, patent applications and publications are incorporated herein by reference in their entirety. The surgical robotic module that forms part of the present invention can form part of a surgical robotic system that includes a user workstation that includes appropriate sensors and displays, and a robot support system (RSS) for interacting with and supporting the robotic subsystem of the present invention in some embodiments. The robotic subsystem includes a motor and a surgical robotic module that includes one or more robotic arms and one or more camera assemblies in some embodiments. The robotic arms and camera assembly can form part of a single support axis robotic system, can form part of the split arm (SA) architecture robotic system, or can have another arrangement. The robot support system can provide multiple degrees of freedom such that the robotic module can be maneuvered within the patient into a single position or multiple different positions. In one embodiment, the robot support system can be directly mounted to a surgical table or to the floor or ceiling within an operating room. In another embodiment, the mounting is achieved by various fastening means, including but not limited to, clamps, screws, or a combination thereof. In other embodiments, the structure can be free standing. The robot support system can mount a motor assembly that is coupled to the surgical robotic module, which includes the robotic arm assembly and the camera assembly. The motor assembly can include gears, motors, drivetrains, electronics, and the like, for powering the components of the surgical robotic module.
[0037] The robotic arm assembly and the camera assembly are capable of multiple degrees of freedom of movement. According to some embodiments, when the robotic arm assembly and the camera assembly are inserted into a patient through the trocar, they are capable of movement in at least the axial, yaw, pitch, and roll directions. The robotic arms of the robotic arm assembly are designed to incorporate and employ a multi-degree of freedom of movement robotic arm with an end effector mounted at a distal end thereof that corresponds to a wrist area or joint of the user. In other embodiments, the working end (e.g., the end effector end) of the robotic arm is designed to incorporate and use or employ other robotic surgical instruments, such as for example the surgical instruments set forth in U.S. Pub. No. 2018/0221102, the entire contents of which are herein incorporated by reference.
[0038] Like numerical identifiers are used throughout the figures to refer to the same elements.
[0039] FIG. 1 is a schematic illustration of an example surgical robotic system 10 in which aspects of the present disclosure can be employed in accordance with some embodiments of the present disclosure. The surgical robotic system 10 includes an operator console 11 and a robotic subsystem 20 in accordance with some embodiments.
[0040] The operator console 11 includes a display 12, an image computing module 14, which can be a three-dimensional (3D) computing module, hand controllers 17 having a sensing and tracking module 16, and a computing module 18. Additionally, the operator console 11 can include a foot pedal array 19 including a plurality of pedals. The image computing module 14 can include a graphical user interface 39. The graphical user interface 39, the controller 26 or the image renderer 30, or both, can render one or more images or one or more graphical user interface elements on the graphical user interface 39. For example, a pillar box associated with a mode of operating the surgical robotic system 10, or any of the various components of the surgical robotic system 10, can be rendered on the graphical user interface 39. Live video footage captured by a camera assembly 44 can also be rendered by the controller 26 or the image renderer 30 on the graphical user interface 39.
[0041] The operator console 11 can include a visualization system 9 that includes a display 12 which can be any selected type of display for displaying information, images or video generated by the image computing module 14, the computing module 18, and/or the robotic subsystem 20. The display 12 can include or form part of, for example, a head-mounted display (HMD), an augmented reality (AR) display (e.g., an AR display, or AR glasses in combination with a screen or display), a screen or a display, a two-dimensional (2D) screen or display, a three-dimensional (3D) screen or display, and the like. The display 12 can also
include an optional sensing and tracking module 16A. In some embodiments, the display 12 can include an image display for outputting an image from a camera assembly 44 of the robotic subsystem 20.
[0042] The hand controllers 17 are configured to sense a movement of the operator's hands and/or arms to manipulate the surgical robotic system 10. The hand controllers 17 can include the sensing and tracking module 16, circuitry, and/or other hardware. The sensing and tracking module 16 can include one or more sensors or detectors that sense movements of the operator's hands. In some embodiments, the one or more sensors or detectors that sense movements of the operator's hands are disposed in the hand controllers 17 that are grasped by or engaged by hands of the operator. In some embodiments, the one or more sensors or detectors that sense movements of the operator's hands are coupled to the hands and/or arms of the operator. For example, the sensors of the sensing and tracking module 16 can be coupled to a region of the hand and/or the arm, such as the fingers, the wrist region, the elbow region, and/or the shoulder region. Additional sensors can also be coupled to a head and/or neck region of the operator in some embodiments. In some embodiments, the sensing and tracking module 16 can be external and coupled to the hand controllers 17 via electrical components and/or mounting hardware. In some embodiments, the optional sensor and tracking module 16A can sense and track movement of one or more of an operator's head, of at least a portion of an operator's head, an operator's eyes or an operator's neck based, at least in part, on imaging of the operator in addition to or instead of by a sensor or sensors attached to the operator's body.
[0043] In some embodiments, the sensing and tracking module 16 can employ sensors coupled to the torso of the operator or any other body part. In some embodiments, the sensing and tracking module 16 can employ, in addition to the sensors, an Inertial Momentum Unit (IMU) having, for example, an accelerometer, gyroscope, magnetometer, and a motion processor. The addition of a magnetometer allows for reduction in sensor drift about a vertical axis. In some embodiments, the sensing and tracking module 16 can also include sensors placed in surgical material such as gloves, surgical scrubs, or a surgical gown. The sensors can be reusable or disposable. In some embodiments, sensors can be disposed external of the operator, such as at fixed locations in a room, such as an operating room. The external sensors 37 can generate external data 36 that can be processed by the computing module 18 and hence employed by the surgical robotic system 10.
[0044] The sensors generate position and/or orientation data indicative of the position and/or orientation of the operator’s hands and/or arms. The sensing and tracking modules 16 and/or
16A can be utilized to control movement (e.g., changing a position and/or an orientation) of the camera assembly 44 and robotic arm assembly 42 of the robotic subsystem 20. The tracking and position data 34 generated by the sensing and tracking module 16 can be conveyed to the computing module 18 for processing by at least one processor 22.
[0045] The computing module 18 can determine or calculate, from the tracking and position data 34 and 34A, the position and/or orientation of the operator’s hands or arms, and in some embodiments of the operator’s head as well, and convey the tracking and position data 34 and 34A to the robotic subsystem 20. The tracking and position data 34, 34A can be processed by the processor 22 and can be stored for example in the storage 24. The tracking and position data 34 and 34A can also be used by the controller 26, which in response can generate control signals for controlling movement of the robotic arm assembly 42 and/or the camera assembly 44. For example, the controller 26 can change a position and/or an orientation of at least a portion of the camera assembly 44, of at least a portion of the robotic arm assembly 42, or both. In some embodiments, the controller 26 can also adjust the pan and tilt of the camera assembly 44 to follow the movement of the operator’s head.
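As a rough illustration of how tracking and position data could be turned into an arm command, the sketch below scales the incremental hand displacement into a target displacement for the arm tip. The scale factor and the enable flag are assumptions made for illustration only; they do not describe the actual control law implemented by the controller 26.

```python
import numpy as np

MOTION_SCALE = 0.3  # assumed scale-down factor from hand motion to instrument motion

def next_arm_target(arm_target: np.ndarray,
                    hand_pos: np.ndarray,
                    prev_hand_pos: np.ndarray,
                    tracking_enabled: bool) -> np.ndarray:
    """Advance the commanded arm tip position by the scaled hand displacement."""
    if not tracking_enabled:
        return arm_target  # hold position when tracking is paused
    return arm_target + MOTION_SCALE * (hand_pos - prev_hand_pos)
```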
[0046] The robotic subsystem 20 can include a robot support system (RSS) 46 having a motor 40 and a trocar 50 or trocar mount, the robotic arm assembly 42, and the camera assembly 44. The robotic arm assembly 42 and the camera assembly 44 can form part of a single support axis robot system, such as that taught and described in U.S. Patent No. 10,285,765, or can form part of a split arm (SA) architecture robot system, such as that taught and described in PCT Patent Application No. PCT/US2020/039203, both of which are incorporated herein by reference in their entirety.
[0047] The robotic subsystem 20 can employ multiple different robotic arms that are deployable along different or separate axes. In some embodiments, the camera assembly 44, which can employ multiple different camera elements, can also be deployed along a common separate axis. Thus, the surgical robotic system 10 can employ multiple different components, such as a pair of separate robotic arms and the camera assembly 44, which are deployable along different axes. In some embodiments, the robotic arm assembly 42 and the camera assembly 44 are separately manipulatable, maneuverable, and movable. The robotic subsystem 20, which includes the robotic arm assembly 42 and the camera assembly 44, is disposable along separate manipulatable axes, and is referred to herein as an SA architecture. The SA architecture is designed to simplify and increase efficiency of the insertion of robotic surgical instruments through a single trocar at a single insertion point or site, while concomitantly assisting with deployment of the surgical instruments into a surgical ready
state, as well as the subsequent removal of the surgical instruments through the trocar 50 as further described below.
[0048] The RSS 46 can include the motor 40 and the trocar 50 or a trocar mount. The RSS 46 can further include a support member that supports the motor 40 coupled to a distal end thereof. The motor 40 in turn can be coupled to the camera assembly 44 and to each robotic arm of the robotic arm assembly 42. The support member can be configured and controlled to move linearly, or in any other selected direction or orientation, one or more components of the robotic subsystem 20. In some embodiments, the RSS 46 can be free standing. In some embodiments, the RSS 46 can include the motor 40 that is coupled to the robotic subsystem 20 at one end and to an adjustable support member or element at an opposed end.

[0049] The motor 40 can receive the control signals generated by the controller 26. The motor 40 can include gears, one or more motors, drivetrains, electronics, and the like, for powering and driving the robotic arm assembly 42 and the camera assembly 44 separately or together. The motor 40 can also provide mechanical power, electrical power, mechanical communication, and electrical communication to the robotic arm assembly 42, the camera assembly 44, and/or other components of the RSS 46 and robotic subsystem 20. The motor 40 can be controlled by the computing module 18. The motor 40 can thus generate signals for controlling one or more motors that in turn can control and drive the robotic arm assembly 42, including for example the position and orientation of each robot joint of each robotic arm, as well as the camera assembly 44. The motor 40 can further provide for a translational or linear degree of freedom that is first utilized to insert and remove each component of the robotic subsystem 20 through the trocar 50. The motor 40 can also be employed to adjust the inserted depth of each robotic arm of the robotic arm assembly 42 when inserted into the patient 100 through the trocar 50.
[0050] The trocar 50 is a medical device that can be made up of an awl (which can be a metal or plastic sharpened or non-bladed tip), a cannula (essentially a hollow tube), and a seal in some embodiments. The trocar 50 can be used to place at least a portion of the robotic subsystem 20 in an interior cavity of a subject (e.g., a patient) and can withdraw gas and/or fluid from a body cavity. The robotic subsystem 20 can be inserted through the trocar 50 to access and perform an operation in vivo in a body cavity of a patient. In some embodiments, the robotic subsystem 20 can be supported, at least in part, by the trocar 50 or a trocar mount with multiple degrees of freedom such that the robotic arm assembly 42 and the camera assembly 44 can be maneuvered within the patient into a single position or multiple different positions. In some embodiments, the robotic arm assembly 42 and camera assembly 44 can
be moved with respect to the trocar 50 or a trocar mount with multiple different degrees of freedom such that the robotic arm assembly 42 and the camera assembly 44 can be maneuvered within the patient into a single position or multiple different positions.
[0051] In some embodiments, the RSS 46 can further include an optional controller for processing input data from one or more of the system components (e.g., the display 12, the sensing and tracking module 16, the robotic arm assembly 42, the camera assembly 44, and the like), and for generating control signals in response thereto. The motor 40 can also include a storage element for storing data in some embodiments.
[0052] The robotic arm assembly 42 can be controlled to follow the scaled-down movement or motion of the operator's arms and/or hands as sensed by the associated sensors in some embodiments and in some modes of operation. The robotic arm assembly 42 includes a first robotic arm including a first end effector at a distal end of the first robotic arm, and a second robotic arm including a second end effector disposed at a distal end of the second robotic arm. In some embodiments, the robotic arm assembly 42 can have portions or regions that can be associated with movements associated with the shoulder, elbow, and wrist joints as well as the fingers of the operator. For example, the robotic elbow joint can follow the position and orientation of the human elbow, and the robotic wrist joint can follow the position and orientation of the human wrist. The robotic arm assembly 42 can also have associated therewith end regions that can terminate in end-effectors that follow the movement of one or more fingers of the operator in some embodiments, such as for example the index finger as the user pinches together the index finger and thumb. In some embodiments, the robotic arm assembly 42 can follow movement of the arms of the operator in some modes of control while a virtual chest of the robotic assembly remains stationary (e.g., in an instrument control mode). In some embodiments, the position and orientation of the torso of the operator are subtracted from the position and orientation of the operator's arms and/or hands. This subtraction allows the operator to move his or her torso without the robotic arms moving. Further disclosure regarding control of movement of individual arms of a robotic assembly is provided in International Patent Application Publications WO 2022/094000 A1 and WO 2021/231402 A1, each of which is incorporated by reference herein in its entirety.
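The torso subtraction described above can be viewed as expressing the tracked hand pose in a torso-fixed frame before it is mapped to an arm command. The following is a minimal illustrative sketch of that idea using 4x4 homogeneous transforms; it is not the actual implementation of the surgical robotic system 10.

```python
import numpy as np

def hand_pose_in_torso_frame(T_world_hand: np.ndarray, T_world_torso: np.ndarray) -> np.ndarray:
    """Express the tracked hand pose relative to the operator's torso.

    Both inputs are 4x4 homogeneous transforms. If the operator moves the torso
    and hand together, this relative pose is unchanged, so the commanded arm
    motion is unchanged as well.
    """
    return np.linalg.inv(T_world_torso) @ T_world_hand
```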
[0053] The camera assembly 44 is configured to provide the operator with image data 48, such as for example a live video feed of an operation or surgical site, as well as enable the operator to actuate and control the cameras forming part of the camera assembly 44. In some embodiments, the camera assembly 44 can include one or more cameras (e.g., a pair of cameras), the optical axes of which are axially spaced apart by a selected distance, known as
the inter-camera distance, to provide a stereoscopic view or image of the surgical site. In some embodiments, the operator can control the movement of the cameras via movement of the hands via sensors coupled to the hands of the operator or via hand controllers 17 grasped or held by hands of the operator, thus enabling the operator to obtain a desired view of an operation site in an intuitive and natural manner. In some embodiments, the operator can additionally control the movement of the camera via movement of the operator’s head. The camera assembly 44 is movable in multiple directions, including for example in yaw, pitch and roll directions relative to a direction of view. In some embodiments, the components of the stereoscopic cameras can be configured to provide a user experience that feels natural and comfortable. In some embodiments, the interaxial distance between the cameras can be modified to adjust the depth of the operation site perceived by the operator.
[0054] The image or video data 48 generated by the camera assembly 44 can be displayed on the display 12. In embodiments in which the display 12 includes an HMD, the display can include the built-in sensing and tracking module 16A that obtains raw orientation data for the yaw, pitch and roll directions of the HMD as well as positional data in Cartesian space (x, y, z) of the HMD. In some embodiments, positional and orientation data regarding an operator’s head can be provided via a separate head-tracking module. In some embodiments, the sensing and tracking module 16A can be used to provide supplementary position and orientation tracking data of the display in lieu of or in addition to the built-in tracking system of the HMD. In some embodiments, no head tracking of the operator is used or employed. In some embodiments, images of the operator can be used by the sensing and tracking module 16A for tracking at least a portion of the operator’s head.
[0055] FIG. 2A depicts an example robotic assembly 20, which is also referred to herein as a robotic subsystem, of a surgical robotic system 10 incorporated into or mounted onto a mobile patient cart in accordance with some embodiments. In some embodiments, the robotic subsystem 20 includes the RSS 46, which, in turn includes the motor 40, the robotic arm assembly 42 having end-effectors 45, the camera assembly 44 having one or more cameras 47, and can also include the trocar 50 or a trocar mount.
[0056] FIG. 2B depicts an example of an operator console 11 of the surgical robotic system 10 of the present disclosure in accordance with some embodiments. The operator console 11 includes a display 12, hand controllers 17, and also includes one or more additional controllers, such as a foot pedal array 19 for control of the robotic arm assembly 42, for control of the camera assembly 44, and for control of other aspects of the system.
[0057] FIG. 2B also depicts the left hand controller subsystem 23A and the right hand controller subsystem 23B of the operator console. The left hand controller subsystem 23A includes and supports the left hand controller 17A and the right hand controller subsystem 23B includes and supports the right hand controller 17B. In some embodiments, the left hand controller subsystem 23A can releasably connect to or engage the left hand controller 17A, and the right hand controller subsystem 23B can releasably connect to or engage the right hand controller 17B. In some embodiments, the connections can be both physical and electronic so that the left hand controller subsystem 23A and the right hand controller subsystem 23B can receive signals from the left hand controller 17A and the right hand controller 17B, respectively, including signals that convey inputs received from a user selection on a button or touch input device of the left hand controller 17A or the right hand controller 17B.
[0058] Each of the left hand controller subsystem 23A and the right hand controller subsystem 23B can include components that enable a range of motion of the respective left hand controller 17A and right hand controller 17B, so that the left hand controller 17A and right hand controller 17B can be translated or displaced in three dimensions and can additionally move in the roll, pitch, and yaw directions. Additionally, each of the left hand controller subsystem 23A and the right hand controller subsystem 23B can register movement of the respective left hand controller 17A and right hand controller 17B in each of the foregoing directions and can send a signal providing such movement information to the processor 22 (as illustrated in FIG. 1) of the surgical robotic system 10.
[0059] In some embodiments, each of the left hand controller subsystem 23A and the right hand controller subsystem 23B can be configured to receive and connect to or engage different hand controllers (not illustrated). For example, hand controllers with different configurations of buttons and touch input devices can be provided. Additionally, hand controllers with a different shape can be provided. The hand controllers can be selected for compatibility with a particular surgical robotic system or a particular surgical robotic procedure or selected based upon preference of an operator with respect to the buttons and input devices or with respect to the shape of the hand controller in order to provide greater comfort and ease for the operator.
[0060] FIG. 3 A schematically depicts a side view of the surgical robotic system 10 performing a surgery within an internal cavity 104 of a subject 100 in accordance with some embodiments and for some surgical procedures. FIG. 3B schematically depicts a top view of the surgical robotic system 10 performing the surgery within the internal cavity 104 of the subject 100. The subject 100 (e.g., a patient) is placed on an operation table 102 (e.g., a
surgical table 102). In some embodiments, and for some surgical procedures, an incision is made in the patient 100 to gain access to the internal cavity 104. The trocar 50 is then inserted into the patient 100 at a selected location to provide access to the internal cavity 104 or operation site. The RSS 46 can then be maneuvered into position over the patient 100 and the trocar 50. In some embodiments, the RSS 46 includes a trocar mount that attaches to the trocar 50. The camera assembly 44 and the robotic arm assembly 42 can be coupled to the motor 40 and inserted individually and/or sequentially into the patient 100 through the trocar 50 and hence into the internal cavity 104 of the patient 100. Although the camera assembly 44 and the robotic arm assembly 42 can include some portions that remain external to the subject's body in use, references to insertion of the robotic arm assembly 42 and/or the camera assembly 44 into an internal cavity of a subject and disposing the robotic arm assembly 42 and/or the camera assembly 44 in the internal cavity of the subject are referring to the portions of the robotic arm assembly 42 and the camera assembly 44 that are intended to be in the internal cavity of the subject during use. The sequential insertion method has the advantage of supporting smaller trocars, and thus smaller incisions can be made in the patient 100, reducing the trauma experienced by the patient 100. In some embodiments, the camera assembly 44 and the robotic arm assembly 42 can be inserted in any order or in a specific order. In some embodiments, the camera assembly 44 can be followed by a first robotic arm 42A of the robotic arm assembly 42 and then followed by a second robotic arm 42B of the robotic arm assembly 42, all of which can be inserted into the trocar 50 and hence into the internal cavity 104. Once inserted into the patient 100, the RSS 46 can move the robotic arm assembly 42 and the camera assembly 44 to an operation site manually or automatically controlled by the operator console 11.
[0061] Further disclosure regarding control of movement of individual arms of a robotic arm assembly is provided in International Patent Application Publications WO 2022/094000 A1 and WO 2021/231402 A1, each of which is incorporated by reference herein in its entirety.

[0062] FIG. 4A is a perspective view of a robotic arm subassembly 21 in accordance with some embodiments. The robotic arm subassembly 21 includes a robotic arm 42A, the end-effector 45 having an instrument tip 120 (e.g., monopolar scissors, needle driver/holder, bipolar grasper, or any other appropriate tool), and a shaft 122 supporting the robotic arm 42A. A distal end of the shaft 122 is coupled to the robotic arm 42A, and a proximal end of the shaft 122 is coupled to a housing 124 of the motor 40 (as illustrated in FIG. 2A). At least a portion of the shaft 122 can be external to the internal cavity 104 (as illustrated in FIGS. 3A and 3B).
At least a portion of the shaft 122 can be inserted into the internal cavity 104 (as illustrated in FIGS. 3A and 3B).
[0063] FIG. 4B is a side view of the robotic arm assembly 42. The robotic arm assembly 42 includes a shoulder joint 126 forming a virtual shoulder, an elbow joint 128 having position sensors 132 (e.g., capacitive proximity sensors) and forming a virtual elbow, a wrist joint 130 forming a virtual wrist, and the end-effector 45 in accordance with some embodiments. The shoulder joint 126, the elbow joint 128, and the wrist joint 130 can include a series of hinge and rotary joints to provide each arm with seven positionable degrees of freedom, along with one additional grasping degree of freedom for the end-effector 45 in some embodiments. In some embodiments, the surgical robotic system 10 as a whole has nine degrees of freedom.
[0064] FIG. 5 illustrates a perspective front view of a portion of the robotic assembly 20 configured for insertion into an internal body cavity of a patient. The robotic assembly 20 includes a robotic arm 42A and a robotic arm 42B. The two robotic arms 42A and 42B can define, or at least partially define, a virtual chest 140 of the robotic assembly 20 in some embodiments. In some embodiments, the virtual chest 140 (depicted as a triangle with dotted lines) can be defined by a chest plane extending between a first pivot point 142A of a most proximal joint of the robotic arm 42A (e.g., a shoulder joint 126), a second pivot point 142B of a most proximal joint of the robotic arm 42B, and a camera imaging center point 144 of the camera(s) 47. A pivot center 146 of the virtual chest 140 lies in the middle of the virtual chest 140.
[0065] In some embodiments, sensors in one or both of the robotic arm 42A and the robotic arm 42B can be used by the surgical robotic system 10 to determine a change in location in three-dimensional space of at least a portion of each or both of the robotic arms 42A and 42B. In some embodiments, sensors in one or both of the first robotic arm 42A and second robotic arm 42B can be used by the surgical robotic system 10 to determine a location in three-dimensional space of at least a portion of one robotic arm relative to a location in three-dimensional space of at least a portion of the other robotic arm.
[0066] In some embodiments, the camera assembly 44 is configured to obtain images from which the surgical robotic system 10 can determine relative locations in three-dimensional space. For example, the camera assembly 44 can include multiple cameras, at least two of which are laterally displaced from each other relative to an imaging axis, and the system can be configured to determine a distance to features within the internal body cavity. Further disclosure regarding a surgical robotic system including camera assembly and associated system for determining a distance to features can be found in International Patent Application
Publication No. WO 2021/159409, entitled “System and Method for Determining Depth Perception In Vivo in a Surgical Robotic System,” and published August 12, 2021, which is incorporated by reference herein in its entirety. Information about the distance to features and information regarding optical properties of the cameras can be used by a system to determine relative locations in three-dimensional space.
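Although the cited publication describes its own method in detail, the basic geometric relationship behind stereo distance estimation is straightforward; the sketch below uses the standard rectified-stereo triangulation Z = f·B/d with illustrative numbers (the actual camera parameters of the camera assembly 44 are not specified here).

```python
def depth_from_disparity(disparity_px: float, focal_length_px: float, baseline_mm: float) -> float:
    """Rectified-stereo triangulation: depth Z = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point visible in both cameras")
    return focal_length_px * baseline_mm / disparity_px

# Illustrative values only: 4 mm inter-camera distance, 800 px focal length, 16 px disparity.
print(depth_from_disparity(16.0, 800.0, 4.0))  # -> 200.0 mm to the feature
```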
Optical Filter
[0067] FIG. 6 illustrates transmission spectra 610 and 620 of a conventional optical filter with spatially uniform characteristics for incident angles of 0 and 30 degrees, respectively. Line 630 depicts ICG excitation and line 640 depicts ICG emission. Line 650 depicts fluorescein excitation and line 660 depicts fluorescein emission. As the incident angle increases from 0 degrees to 30 degrees, the transmission spectrum 610 moves to shorter wavelengths as illustrated in the transmission spectrum 620. The blocking band (e.g., a band having about 0% transmittance) for the excitation wavelength starts to transmit more, and the transmission band (e.g., a band having about 60% or more transmittance) for the emission wavelength starts to transmit less. Thus, the signal-to-background ratio decreases, compromising the quality of the fluorescence image.
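The angle dependence shown in FIG. 6 follows the well-known behavior of thin-film interference filters. The figure itself does not state a formula, but a commonly used effective-index approximation for the shift is

$$
\lambda(\theta) \;\approx\; \lambda_0 \sqrt{1 - \left(\frac{\sin\theta}{n_{\mathrm{eff}}}\right)^2}
$$

where $\lambda_0$ is the band edge (or center) wavelength at normal incidence, $\theta$ is the incident angle in air, and $n_{\mathrm{eff}}$ is the effective refractive index of the coating stack. For instance, with an assumed $n_{\mathrm{eff}} \approx 2$, a band edge at 800 nm shifts to roughly 775 nm at a 30 degree incident angle, a shift large enough to degrade the signal-to-background ratio in the manner described above.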
[0068] FIG. 7 illustrates a wide-angle field-of-view fluorescence imager 700 (also referred to as an imager 700) having an angular distribution of light rays 710 as they enter and exit a lens stack 720 to reach an image sensor 730. As used herein, an imager can refer to a device that captures images and/or videos by detecting information (e.g., physical properties, optical properties, and/or spectral properties) associated with one or more objects. A fluorescence imager can refer to a device that captures images and/or videos by detecting fluorescence emitted from one or more objects. Fluorescence images can be generated from a variety of techniques including, but not limited to, imaging probes, and spectroscopy. The imager 700 can include a window 740, a lens stack 720, an optical filter 800 and an image sensor 730. The window 740 can receive light rays 710. The lens stack 720 can focus the light rays 710 onto the optical filter 800. The optical filter 800 can be configured to block excitation light and transmit emission light to the image sensor 730 as further described with respect to FIGS. 8A-8C. The image sensor 730 can create a fluorescence image using the emission light. As illustrated in FIG. 7, light rays 710 can be collected over a wide field of view of the imager 700. A large angular extent of the light rays 710 can be incident on the optical filter 800. As illustrated in FIG. 6, conventional blocking filter characteristics change with the incident angle of the light passing into them. This change is a shift of blocking/transmission ranges to
shorter wavelengths with increasing incidence angle. In contrast, the optical filter 800 can be optimized for different incident angles to maintain excitation wavelength blocking characteristics and emission wavelength transmittance characteristics over a broad incident angle range as described with respect to FIGS. 8A-8C. In some embodiments, the imager 700 can be the camera assembly 44 as described with respect to FIGS. 1 and 3. In some embodiments, the imager 700 can be employed for any other fluorescence imaging system.

[0069] FIG. 8A illustrates an optical filter 800A having four discrete radial zones 810A-810D in accordance with some embodiments. "Discrete radial zones" can refer to zones that are optimized for discrete incident angle ranges (e.g., individual, distinct and/or unconnected incident angle ranges). In some embodiments, discrete radial zones can have annular space between any two neighboring radial zones. In some embodiments, the annular space between radial zones can be coated with light-blocking black chrome.
[0070] As illustrated in FIG. 8A, characteristics (e.g., coating, the number of layers, layer thickness, shape, dimensions, materials, and/or the like) for each of the four discrete radial zones 810A-810D can be constant, but optimized for a specific angular range (e.g., α1, α2, α3, or α4 illustrated in FIG. 8A) in order to maintain excitation wavelength blocking characteristics and emission wavelength transmittance characteristics over a broad incident angle range (e.g., between about -35 degrees or more and about +35 degrees or more, between about -60 degrees or more and about +60 degrees or more, or the like). The characteristics can be different between radial zones, but constant within the same radial zone. Different radial zones can be optimized for different incident angle ranges. For example, the first zone 810A can be optimized for a first incident angle range (e.g., between about 0 degrees and about ±20 degrees) by defining (e.g., changing, adjusting, designing or the like) characteristics of the first zone 810A. Similarly, the second zone 810B can be optimized for a second incident angle range (e.g., between about +20 degrees and about +30 degrees and/or between about -20 degrees and about -30 degrees) by defining (e.g., changing, adjusting, designing, or the like) characteristics of the second zone 810B. The third zone 810C can be optimized for a third incident angle range (e.g., between about +30 degrees and about +40 degrees and/or between about -30 degrees and about -40 degrees) by defining (e.g., changing, adjusting, designing, or the like) characteristics of the third zone 810C. The fourth zone 810D can be optimized for a fourth incident angle range (e.g., between about +40 degrees and about +60 degrees and/or between about -40 degrees and about -60 degrees) by defining characteristics of the fourth zone 810D. Accordingly, different radial zones can have different characteristics to maintain excitation wavelength blocking characteristics and
emission wavelength transmittance characteristics across multiple radial zones over a broad incident angle range.
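As a concrete illustration of the discrete-zone idea, the sketch below represents the four zones of FIG. 8A as a lookup from incident angle to the zone designed for that angular range. The zone boundaries mirror the example ranges quoted above and are illustrative assumptions, not an actual coating specification.

```python
# Illustrative sketch only: the zone boundaries mirror the example angular
# ranges given for zones 810A-810D and are not an actual coating design.
from dataclasses import dataclass

@dataclass
class RadialZone:
    name: str
    max_angle_deg: float  # upper bound of the |incident angle| range this zone is designed for

# Discrete zones of FIG. 8A, ordered from the filter center outward.
ZONES = [
    RadialZone("810A", 20.0),   # designed for |angle| up to about 20 degrees
    RadialZone("810B", 30.0),   # about 20-30 degrees
    RadialZone("810C", 40.0),   # about 30-40 degrees
    RadialZone("810D", 60.0),   # about 40-60 degrees
]

def zone_for_angle(angle_deg: float) -> RadialZone:
    """Return the zone whose design range covers the given incident angle."""
    a = abs(angle_deg)
    for zone in ZONES:
        if a <= zone.max_angle_deg:
            return zone
    raise ValueError(f"Incident angle {angle_deg} deg exceeds the designed field of view")

print(zone_for_angle(25.0).name)  # -> "810B"
```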
[0071] FIG. 8B illustrates an optical filter 800B having two discrete radial zones 810A’ and 810B’ in accordance with some embodiments. Characteristics (e.g., coating, the number of layers, layer thickness, shape, dimensions, materials, and/or the like) for each of the discrete radial zones 810A’ and 810B’ can be constant, but optimized for a specific angular range in order to maintain excitation wavelength blocking characteristics and emission wavelength transmittance characteristics over a broad incident angle range (e.g., between about -35 degrees or more and about +35 degrees or more, between about -60 degrees or more and about +60 degrees or more, or the like). The characteristics for each radial zone can be different, but constant within the same radial zone. Different radial zones can be optimized for different incident angle ranges. For example, the first zone 810A’ can be optimized for a first incident angle range (e.g., between about 0 degrees and about ±20 degrees) by defining characteristics of the first zone 810A’. Similarly, the second zone 810B’ can be optimized for a second incident angle range (e.g., between about +20 degrees and about +35 degrees and/or between about -20 degrees and about -35 degrees, or between about +20 degrees and about +60 degrees and/or between about -20 degrees and about -60 degrees) by defining characteristics of the second zone 810B’. Discrete radial zones 810A’ and 810B’ can have an annular space between them. The annular space can be coated with light-blocking black chrome.
[0072] FIG. 8C illustrates an optical filter 800C having continuously-varying radial zones 820 in accordance with some embodiments. “Continuously-varying radial zones” can refer to zones that are optimized for continuous incident angle ranges that vary along radial directions.
[0073] Characteristics (e.g., coating, the number of layers, layer thickness, shape, dimensions, materials, and/or the like) of the zones 820 can be continuously radially varying and can be optimized for continuous incident angle ranges varying along radial directions in order to maintain excitation wavelength blocking characteristics and emission wavelength transmittance characteristics over a broad incident angle range (e.g., between about -35 degrees or more and about +35 degrees or more, between about -60 degrees or more and about +60 degrees or more, or the like).
[0074] It should be understood that different incident angle ranges can be other angular ranges, but are not limited to the angular ranges described with respect to FIGS. 8A-8C. The incident angle range for each radial zone can be determined based on the radially-varying characteristics of each radial zone. For example, coating, the number of layers, layer
thickness, and/or materials can be different for different incident angle ranges. As the incident angle increases, coating, the number of layers, layer thickness, and/or materials of a radial zone can be optimized to prevent a transmission spectrum of that radial zone from moving to shorter wavelengths, thereby overcoming drawbacks of optical filters having spatially uniform characteristics as illustrated in FIG. 6.
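One way to picture the per-zone optimization is that each zone's coating is re-centered so that, at that zone's typical incident angle, the blocking band falls back onto the excitation wavelength. The sketch below applies the effective-index approximation given earlier; the effective index and the representative zone angles are assumed values for illustration, not taken from this disclosure.

```python
import math

def design_center_wavelength(target_nm: float, angle_deg: float, n_eff: float = 2.0) -> float:
    """Normal-incidence center wavelength needed so the band sits at target_nm
    when illuminated at angle_deg (effective-index approximation)."""
    shift = math.sqrt(1.0 - (math.sin(math.radians(angle_deg)) / n_eff) ** 2)
    return target_nm / shift

# Re-center an ~810 nm excitation blocking band for an assumed typical angle of each zone.
for zone_angle in (10, 25, 35, 50):
    print(zone_angle, round(design_center_wavelength(810.0, zone_angle), 1))
```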
[0075] In some embodiments, with respect to FIGS. 8A-8C, the optical filters 800A-800C can have an optical density of 4 or more for the blocking wavelength bands. In some embodiments, the optical filters 800A-800C can have about 80% or more transmittance for the transmission wavelength bands. In some embodiments, a thickness of each radial zone 810/820 can be less than about 0.3 mm (e.g., in a range of about 0.21 mm to about 0.3 mm). In some embodiments, the optical filters 800A-800C can have a central radial zone located at the center of the optical filter having a diameter of about 2.6 mm. In some embodiments, the optical filters 800A-800C can have dimensions of about 6.2 mm x about 5.7 mm. In some embodiments, the substrate material for the optical filters 800A-800C can be fused silica or an equivalent.
[0076] In some embodiments, the optical filters 800A-800C can include one or more blocking wavelength bands. For example, each radial zone can include a first blocking wavelength band in a visible wavelength range (e.g., in a wavelength range of about 470 nm to about 520 nm) and a second blocking wavelength band in a near infrared wavelength range (e.g., in a wavelength range of about 720 nm to about 845 nm). In some embodiments, the optical filters 800A-800C can include one or more transmission wavelength bands. For example, each radial zone can include a first transmission wavelength band in a wavelength range of about 430 nm to about 465 nm, a second transmission wavelength band in a wavelength range of about 525 nm to about 715 nm and a third transmission wavelength band in a wavelength range of about 850 nm to about 880 nm. It should be understood that the blocking wavelength bands and the transmission wavelength bands can include any wavelength ranges, but are not limited to the above wavelength ranges. It should be also understood that the optical filters 800A-800C can include any number of blocking wavelength bands and the transmission wavelength bands, but are not limited to the above number of bands.
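To make the example band layout above concrete, the sketch below encodes the quoted blocking and transmission band edges and classifies a wavelength as blocked, transmitted, or out of band. The band edges are the example values from this paragraph and may differ in an actual filter design.

```python
# Example band edges from the paragraph above (nanometers); illustrative only.
BLOCKING_BANDS = [(470, 520), (720, 845)]               # visible and near-infrared excitation
TRANSMISSION_BANDS = [(430, 465), (525, 715), (850, 880)]

def classify(wavelength_nm: float) -> str:
    """Classify a wavelength against the example blocking/transmission bands."""
    if any(lo <= wavelength_nm <= hi for lo, hi in BLOCKING_BANDS):
        return "blocked"       # excitation light rejected (e.g., optical density of 4 or more)
    if any(lo <= wavelength_nm <= hi for lo, hi in TRANSMISSION_BANDS):
        return "transmitted"   # emission light passed (e.g., about 80% or more transmittance)
    return "out of band"

print(classify(810))   # near-infrared excitation example -> "blocked"
print(classify(860))   # near-infrared emission band example -> "transmitted"
```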
[0077] FIG. 9 illustrates a fluorescence imaging system 900 having the optical filter 800A in accordance with some embodiments. The fluorescence imaging system 900 can include an excitation wavelength source 910 and the imager 700’ having the optical filter 800A. The imager 700’ can include a camera lens, the optical filter 800A and the image sensor 730. The
excitation wavelength source 910 (e.g., a laser, a light-emitting diode (LED), or the like) can illuminate a specimen 920 with fluorescent properties. For example, an in-vivo biological tissue within an internal cavity of a patient and/or an in-vitro biological tissue can be labeled by ICG and/or fluorescein. The excitation wavelength source 910 can direct excitation light rays 930 onto the specimen 920 with fluorescent properties, which emits emission wavelength rays 940 and reflects excitation rays 930 onto the camera lens 720. The camera lens 720 focuses the light rays onto the optical filter 800A, which has radial zones, each radial zone optimized for a specific incident angle range covering the incident angles of the excitation rays 930 and the emission light rays 940 incident upon that radial zone. The excitation rays 930 can be blocked by the optical filter 800A and the emission rays 940 can pass through the optical filter 800A onto the image sensor 730 to create an image 950 on the image sensor 730.
[0078] In some embodiments, the fluorescence imaging system 900 can be a multispectral imaging system. For example, the excitation wavelength source 910 can perform multispectral illumination with excitation light in multiple wavelength ranges. In some embodiments, the excitation wavelength source 910 can include one or more light sources. For example, a first light source can be a set of white light LEDs to provide visible imaging. A second light source can be a set of green LEDs to provide an excitation wavelength at about 490 nm for the fluorescein imaging. A third light source can be a vertical-cavity surface-emitting laser (VCSEL) providing an excitation wavelength at about 810 nm for ICG excitation. In some embodiments, the excitation light source 910 can have one or more narrow-bandpass filters for the light sources to restrict their emission to a narrow wavelength band and block out-of-band wavelengths that may pass into the imager 700’ in the emission band. In some embodiments, the specimen 920 is not illuminated with multiple light sources simultaneously, to prevent cross-interference in the emission bands. Instead, the specimen 920 can be illuminated by multiple light sources individually. The light sources can be multiplexed and separate images can be taken. For example, by using the three light sources as described above, a white light only image, a fluorescein only image, and an ICG only image can be obtained. Computer processing can enhance these images to bring out desired features, and then the images may be merged to present a single image to an operator with the fluorescent features overlaid with highlights.
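The following is a hypothetical sketch of the time-multiplexed capture and overlay merge described above. The capture function and the per-channel weighting are placeholders for illustration only; they are not a real camera or light-source driver API, and the false-color scheme is an assumption.

```python
import numpy as np

def capture_frame(active_source: str, shape=(480, 640)) -> np.ndarray:
    """Placeholder for 'illuminate with one source and grab one frame';
    synthesizes data here so the sketch runs end to end."""
    rng = np.random.default_rng(abs(hash(active_source)) % (2**32))
    return rng.integers(0, 256, size=shape, dtype=np.uint8)

def acquire_multiplexed() -> dict:
    """Illuminate with one source at a time and capture a separate image for each."""
    frames = {}
    for source in ("white", "green_490nm", "vcsel_810nm"):
        frames[source] = capture_frame(source)
    return frames

def merge_overlay(frames: dict) -> np.ndarray:
    """Overlay the fluorescence channels onto the white-light image as false-color highlights."""
    base = np.stack([frames["white"]] * 3, axis=-1).astype(np.float32)  # grayscale -> RGB
    base[..., 1] += 0.5 * frames["green_490nm"]   # fluorescein channel highlighted in green
    base[..., 0] += 0.5 * frames["vcsel_810nm"]   # ICG channel highlighted in red
    return np.clip(base, 0, 255).astype(np.uint8)

merged = merge_overlay(acquire_multiplexed())
print(merged.shape)  # (480, 640, 3)
```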
[0079] It should be understood that the optical filter 800A is used for illustration. The optical filter 800A can be replaced by the optical filter 800B or 800C.
[0080] While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are
provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It may be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.
Claims
1. An optical filter comprising: a first radial zone to receive a first plurality of light beams having a first incident angle range, the first radial zone comprising one or more blocking wavelength bands and one or more transmission wavelength bands; and a second radial zone to receive a second plurality of light beams having a second incident angle range greater than the first incident angle range, wherein the second radial zone maintains one or more excitation wavelength blocking characteristics within the one or more blocking wavelength bands and one or more emission wavelength transmittance characteristics within the one or more transmission wavelength bands across the first and second radial zones.
2. The optical filter of claim 1, wherein each of the first and second radial zones has excitation wavelength blocking characteristics satisfying a respective blocking characteristic threshold and the emission wavelength transmittance characteristics satisfying a respective transmittance characteristic threshold indicating that the excitation wavelength blocking characteristics and the emission wavelength transmittance characteristics are constant across the first and second radial zones for different incident angles over a wide field of view.
3. The optical filter of claim 1, wherein characteristics for each of the first and second radial zones are constant, and wherein the characteristics for each of the first and second radial zones are defined for a specific angular range.
4. The optical filter of claim 1, wherein characteristics of the first radial zone are different from characteristics of the second radial zone.
5. The optical filter of claim 1, wherein the first and second radial zones are discrete zones or continuously varying zones.
6. The optical filter of claim 1, further comprising:
a third radial zone to receive a third plurality of light beams having a third incident angle range greater than the second incident angle range, wherein the third radial zone maintains the one or more excitation wavelength blocking characteristics and the one or more emission wavelength transmittance characteristics across the first and second radial zones.
7. The optical filter of claim 6, further comprising: a fourth radial zone to receive a fourth plurality of light beams having a fourth incident angle range greater than the third incident angle range, wherein the fourth radial zone maintains the one or more excitation wavelength blocking characteristics and the one or more emission wavelength transmittance characteristics across the first, second, and third radial zones.
8. The optical filter of claim 1, wherein the first incident angle range comprises incident angles of the first plurality of light beams incident upon the first radial zone in a range between about 0 degree and about 20 degrees, and the second incident angle range comprises incident angles of the second plurality of light beams incident upon the second radial zone in a range between about 20 degrees and about 35 degrees or more.
9. The optical filter of claim 1, wherein the one or more blocking wavelength bands comprise a first blocking wavelength band in a visible wavelength range and a second blocking wavelength band in a near infrared wavelength range.
10. The optical filter of claim 1, wherein a thickness of each of the first radial zone and the second radial zone is less than about 0.3 millimeters.
11. The optical filter of claim 1, wherein a diameter of the first radial zone is about 2.6 millimeters.
12. The optical filter of claim 1, wherein the optical filter receives light having wide incident angles transmitted by a lens assembly.
13. The optical filter of claim 12, wherein the wide incident angles are in a range between about -35 degrees or more and about +35 degrees or more.
14. The optical filter of claim 12, wherein the wide incident angles are in a range between about -60 degrees or more and about +60 degrees or more.
15. The optical filter of claim 1, wherein the optical filter is a spectral filter.
16. The optical filter of claim 1, wherein a portion of the first radial zone overlaps a portion of the second radial zone.
17. The optical filter of claim 16, wherein the overlapped portion includes characteristics for which the first and second radial zones are filtered.
18. The optical filter of claim 16, wherein the overlapped portion between the first and second radial zones is coated by a light blocking material.
19. The optical filter of claim 1, wherein an annular space is located between the first and second radial zones to prevent an overlap of the first and second radial zones.
20. The optical filter of claim 19, wherein the annular space is coated by a light blocking material.
21. A camera assembly for wide field-of-view imaging, comprising: a lens assembly; an optical filter of any of claims 1-20; and one or more image sensors having a wide-angle field-of-view and configured to capture light in one or more selected wavelength bands transmitted from the optical filter.
22. A surgical robotic system, comprising: a light source; and a camera assembly of claim 21.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363541660P | 2023-09-29 | 2023-09-29 | |
| US63/541,660 | 2023-09-29 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025072724A1 true WO2025072724A1 (en) | 2025-04-03 |
Family
ID=93212008
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/048930 Pending WO2025072724A1 (en) | 2023-09-29 | 2024-09-27 | Wide-angle field-of-view fluorescent scene imager |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025072724A1 (en) |
Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH08224210A (en) * | 1995-02-23 | 1996-09-03 | Olympus Optical Co Ltd | Fluorescence observing device |
| US20040186351A1 (en) * | 1996-11-20 | 2004-09-23 | Olympus Optical Co., Ltd. (Now Olympus Corporation) | Fluorescent endoscope system enabling simultaneous achievement of normal light observation based on reflected light and fluorescence observation based on light with wavelengths in infrared spectrum |
| JP3609854B2 (en) * | 1994-08-12 | 2005-01-12 | Pentax Corporation | Lighting device having optical fiber |
| JP2006285214A (en) * | 2005-04-04 | 2006-10-19 | Ctx Opto Electronics Corp | Filtering device and optical lens apparatus |
| EP1720050A1 (en) * | 2004-02-09 | 2006-11-08 | Tamron Co., Ltd. | Chromatic aberration correction imaging optical system |
| US20180221102A1 (en) | 2017-02-09 | 2018-08-09 | Vicarious Surgical Inc. | Virtual reality surgical tools system |
| US20190076199A1 (en) | 2017-09-14 | 2019-03-14 | Vicarious Surgical Inc. | Virtual reality surgical camera system |
| US10285765B2 (en) | 2014-05-05 | 2019-05-14 | Vicarious Surgical Inc. | Virtual reality surgical device |
| US20210239891A1 (en) * | 2014-03-04 | 2021-08-05 | Stryker European Operations Limited | Spatial and spectral filtering apertures and optical imaging systems including the same |
| WO2021159409A1 | 2020-02-13 | 2021-08-19 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Power control method and apparatus, and terminal |
| WO2021231402A1 (en) | 2020-05-11 | 2021-11-18 | Vicarious Surgical Inc. | System and method for reversing orientation and view of selected components of a miniaturized surgical robotic unit in vivo |
| WO2022094000A1 (en) | 2020-10-28 | 2022-05-05 | Vicarious Surgical Inc. | Laparoscopic surgical robotic system with internal degrees of freedom of articulation |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230255446A1 (en) | Surgical visualization systems and displays | |
| US20240382174A1 (en) | Surgical visualization systems | |
| US11147443B2 (en) | Surgical visualization systems and displays | |
| US20230122367A1 (en) | Surgical visualization systems and displays | |
| US12219228B2 (en) | Stereoscopic visualization camera and integrated robotics platform | |
| US20220054223A1 (en) | Surgical visualization systems and displays | |
| US10028651B2 (en) | Surgical visualization systems and displays | |
| US11154378B2 (en) | Surgical visualization systems and displays | |
| US12349860B2 (en) | Medical observation system, control device, and control method | |
| JP2023544360A (en) | Interactive information overlay on multiple surgical displays | |
| JP2023544594A (en) | Display control of layered systems based on capacity and user operations | |
| JP2023544593A (en) | collaborative surgical display | |
| WO2025072724A1 (en) | Wide-angle field-of-view fluorescent scene imager | |
| US20250344942A1 (en) | Multispectral imaging camera and methods of use | |
| WO2021049220A1 (en) | Medical support arm and medical system | |
| CN221888377U (en) | Lens assembly and endoscope camera system | |
| CN117643511A (en) | Light source host and endoscope camera system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24793924 Country of ref document: EP Kind code of ref document: A1 |