
WO2025072724A1 - Fluorescent scene imager with wide-angle field of view - Google Patents

Fluorescent scene imager with wide-angle field of view

Info

Publication number
WO2025072724A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical filter
radial
wavelength
blocking
radial zone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/048930
Other languages
English (en)
Inventor
Michael Scott CAFFERTY
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vicarious Surgical Inc
Original Assignee
Vicarious Surgical Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vicarious Surgical Inc filed Critical Vicarious Surgical Inc
Publication of WO2025072724A1 publication Critical patent/WO2025072724A1/fr
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 - Surgical robots
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163 - Optical arrangements
    • A61B1/00186 - Optical arrangements with imaging filters
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 - Image-producing devices, e.g. surgical cameras
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00 - Optical elements other than lenses
    • G02B5/20 - Filters
    • G02B5/208 - Filters for use with infrared or ultraviolet radiation, e.g. for separating visible light from infrared and/or ultraviolet radiation

Definitions

  • a process of fluorescence imaging can use a source of light of an excitation wavelength to illuminate a subject and stimulate an emission of light of another wavelength to make an image of features in a scene that are not easily visible under normal illumination.
  • the subject can be an unaltered specimen, or one that has been dyed with special fluorescent dye(s) to highlight specific features of the specimen.
  • fluorescence-based imaging provides surgeons with visualization of anatomy and tissue activity not visible through normal visualization.
  • One common form of fluorescence imaging used in surgery uses indocyanine green (ICG) dye, which is injected into a patient’s bloodstream to image anatomical features and conditions such as tissue perfusion and blood flow. Multiple other dyes and autofluorescence capabilities enable different visualization behaviors that aid surgeons in targeting the correct tissue to dissect or avoid.
  • a fluorescence imaging camera can be used for fluorescence imaging.
  • the fluorescence imaging camera selectively blocks the much stronger excitation light wavelength (background) that reflects back from the subject and transmits the weaker emission wavelength (signal) from the subject to create a useful image with a high signal-to-background ratio.
  • Conventional filters can perform a selective blocking function to separate the emission and excitation wavelengths.
  • such filters do not ensure that the selective blocking function is maintained appropriately across a wide-angle field of view of the fluorescence imaging camera due to their limited incident angle ranges. That is, in a wide-angle field of view, more light at the excitation wavelength outside the limited incident angle range can pass through the filters. This results in poor performance, such as a reduced signal-to-background ratio, an obscured image, and the like.
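As a rough illustration of why excitation leakage matters, the signal-to-background ratio can be estimated from the filter transmittance in the emission and excitation bands. This is a minimal sketch with assumed numbers, not values from the disclosure:

```python
# Illustrative sketch (assumed numbers): how excitation leakage at large
# incident angles degrades the signal-to-background ratio (SBR).

def signal_to_background(emission_signal, excitation_background,
                         t_emission, t_excitation_leak):
    """SBR at the sensor after filtering.

    emission_signal       -- fluorescence emission reaching the filter
    excitation_background -- reflected excitation light reaching the filter
    t_emission            -- filter transmittance in the emission band (0..1)
    t_excitation_leak     -- filter transmittance in the excitation band (0..1)
    """
    return (emission_signal * t_emission) / (excitation_background * t_excitation_leak)

# Reflected excitation is typically orders of magnitude stronger than emission.
emission, excitation = 1.0, 1e4

# Within the filter's rated angle range: strong blocking (assumed 1e-6 leakage).
print(signal_to_background(emission, excitation, 0.9, 1e-6))  # ~90

# Outside the rated angle range: blocking degrades (assumed 1e-3 leakage).
print(signal_to_background(emission, excitation, 0.8, 1e-3))  # ~0.08
```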
  • the present disclosure provides a multiple zone spectral optical filter (also referred to as “optical filter”).
  • the optical filter can include a first radial zone to receive a first plurality of light beams having a first incident angle range.
  • the first zone can include one or more blocking wavelength bands and one or more transmission wavelength bands.
  • the optical filter can further include a second radial zone to receive a second plurality of light beams having a second incident angle range greater than the first incident angle range.
  • the second radial zone can maintain one or more excitation wavelength blocking characteristics within the one or more blocking wavelength bands and one or more emission wavelength transmittance characteristics within the one or more transmission wavelength bands across the first and second zones.
  • the second radial zone can have the same excitation wavelength blocking characteristics and the emission wavelength transmittance characteristics as the first zone.
  • a change in the excitation wavelength blocking characteristics and/or the emission wavelength transmittance characteristics between the first zone and the second zone can satisfy a change threshold, i.e., a value or value range requiring that the difference among different radial zones be less than that value or fall within that range.
  • the optical filter can include a third radial zone to receive a third plurality of light beams having a third incident angle range greater than the second incident angle range, wherein the third radial zone maintains the one or more excitation wavelength blocking characteristics and the one or more emission wavelength transmittance characteristics across the first and second radial zones.
  • the optical filter can further include a fourth radial zone to receive a fourth plurality of light beams having a fourth incident angle range greater than the third incident angle range, wherein the fourth radial zone maintains the one or more excitation wavelength blocking characteristics and the one or more emission wavelength transmittance characteristics across the first, second, and third radial zones.
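The zone structure described in the preceding bullets can be pictured with a small data-model sketch. The zone radii, angle ranges, band edges, and the change-threshold check below are hypothetical values chosen only to illustrate radial zones that present the same blocking and transmission bands:

```python
# Hypothetical sketch of a multiple zone spectral optical filter: each radial
# zone covers an incident-angle range and defines blocking and transmission
# wavelength bands. All numeric values are assumptions for illustration.
from dataclasses import dataclass
from typing import List, Tuple

Band = Tuple[float, float]  # (low_nm, high_nm)

@dataclass
class RadialZone:
    r_outer_mm: float               # outer radius of the zone on the filter
    incident_angles_deg: Band       # incident-angle range the zone is designed for
    blocking_bands: List[Band]      # excitation wavelengths to block
    transmission_bands: List[Band]  # emission wavelengths to pass

@dataclass
class MultiZoneFilter:
    zones: List[RadialZone]         # ordered from innermost to outermost

    def zone_at(self, radius_mm: float) -> RadialZone:
        for zone in self.zones:
            if radius_mm <= zone.r_outer_mm:
                return zone
        return self.zones[-1]

def characteristics_maintained(f: MultiZoneFilter, tol_nm: float = 5.0) -> bool:
    """Check a 'change threshold': band edges differ between zones by < tol_nm."""
    first = f.zones[0]
    for zone in f.zones[1:]:
        for b0, b1 in zip(first.blocking_bands + first.transmission_bands,
                          zone.blocking_bands + zone.transmission_bands):
            if abs(b0[0] - b1[0]) > tol_nm or abs(b0[1] - b1[1]) > tol_nm:
                return False
    return True

# Two-zone example: inner zone for 0-20 deg, outer zone for 20-35 deg.
example_filter = MultiZoneFilter(zones=[
    RadialZone(3.0, (0, 20),  [(770, 810)], [(820, 860)]),
    RadialZone(6.0, (20, 35), [(770, 810)], [(820, 860)]),
])
print(example_filter.zone_at(4.5).incident_angles_deg)  # (20, 35)
print(characteristics_maintained(example_filter))       # True
```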
  • a multiple zone spectral optical filter as taught herein can be employed as part of a camera assembly.
  • the present disclosure provides a camera assembly for wide-angle field-of-view imaging.
  • the camera assembly can include a lens assembly and an optical filter as taught herein.
  • the optical filter can receive light having wide incident angles transmitted by the lens assembly.
  • the camera assembly can further include one or more image sensors having a wide-angle field-of-view and configured to capture light in one or more selected wavelength bands transmitted from the optical filter.
  • a multiple zone spectral optical filter as taught herein can be employed as part of a laparoscope or surgical robotic system.
  • the present disclosure provides a robotic surgical system.
  • the robotic surgical system can include a light source and a camera assembly having a multiple zone spectral optical filter as taught herein.
  • the robotic surgical system can perform multispectral imaging.
  • the multispectral imaging can include multispectral illumination with excitation light in multiple wavelength ranges (e.g., in the visible, near-infrared, and/or infrared spectra) and detection of emission light in multiple wavelength ranges (e.g., in the visible, near-infrared, and/or infrared spectra).
  • the camera assembly can be a multispectral camera assembly that enables simultaneous imaging of non-visible light (e.g., near-infrared and/or infrared fluorescence) and visible light (e.g., visible light fluorescence) of an internal body space.
  • FIG. 1 is a diagram illustrating an example surgical robotic system in accordance with some embodiments.
  • FIG. 2B is an example perspective view of an example operator console of a surgical robotic system of the present disclosure in accordance with some embodiments.
  • FIG. 3A is a diagram illustrating an example side view of a surgical robotic system performing a surgery within an internal cavity of a subject in accordance with some embodiments.
  • FIG. 3B is a diagram illustrating an example top view of the surgical robotic system performing the surgery within the internal cavity of the subject of FIG. 3A in accordance with some embodiments.
  • FIG. 4A is an example perspective view of a single robotic arm subsystem in accordance with some embodiments.
  • FIG. 4B is an example perspective side view of a single robotic arm of the single robotic arm subsystem of FIG. 4A in accordance with some embodiments.
  • FIG. 5 is an example perspective front view of a camera assembly and a robotic arm assembly in accordance with some embodiments.
  • FIG. 6 illustrates transmission spectra of an optical filter with spatially uniform characteristics for incident angles of 0 and 30 degrees, respectively.
  • FIG. 8B illustrates an optical filter having two discrete radial zones in accordance with some embodiments.
  • FIG. 8C illustrates an optical filter having continuously-varying radial zones in accordance with some embodiments.
  • FIG. 9 illustrates a fluorescence imaging system using the optical filter illustrated in FIG. 8A in accordance with some embodiments.
  • Fluorescence can help visualize blood vessels, ureters, cancer, nerves, and tissue perfusion, amongst other tissue types and anatomical features. All types of fluorescence, such as dye-based fluorescence, autofluorescence, and other types of differential visualization, may be paired with a fluorescence imaging system.
  • the fluorescence imaging system can employ filters on a camera assembly that selectively block specific frequencies of emitted light.
  • Such an imaging system can use wide-angle lenses that image large areas by collecting light at large angles, for example, between about -60 degrees or more and about +60 degrees or more. The light rays travel through the wide-angle lenses at large angles and also impinge upon an image sensor at large angles.
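The link between field angle and position on the image plane is why incident angle can be treated as a function of radius on a filter placed near the sensor. A minimal sketch, assuming an ideal rectilinear projection r = f·tan(θ); a real wide-angle endoscopic lens would follow its own distortion model:

```python
# Illustrative sketch (not from the patent): rays from larger field angles land
# farther from the optical axis, so radial position on a filter near the image
# plane corresponds to an incident-angle range. Rectilinear projection assumed.
import math

def radial_position_mm(field_angle_deg: float, focal_length_mm: float) -> float:
    return focal_length_mm * math.tan(math.radians(field_angle_deg))

f_mm = 2.0  # assumed short focal length, typical of wide-angle endoscopic optics
for angle in (0, 20, 35, 60):
    print(f"{angle:>2} deg -> r = {radial_position_mm(angle, f_mm):.2f} mm")
```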
  • multilayer filters can perform a selective blocking function to separate emission wavelengths from excitation wavelengths.
  • multilayer filters can be sensitive to incident angles. Often, they can block excitation wavelengths at a limited incident angle range, such as between about -20 degrees and about +20 degrees. Outside of the limited incident angle range, the multilayer filters can allow more light at the excitation wavelength to pass through, reducing a signal-to-background ratio, and obscuring a desired image.
  • the usable incident angle range can be increased somewhat, but at the cost of greater multilayer complexity, more exotic materials, and higher processing costs, which may not be suitable for surgical devices (e.g., surgical endoscopes, or the like).
  • conventional optical filters can have spatially uniform characteristics (e.g., coating, the number of layers, layer thickness, shape, dimensions, materials, and/or the like are spatially uniform across an entire optical filter) such that excitation wavelength blocking characteristics and emission wavelength transmittance characteristics vary for different incident angles. For example, as incident angles increase, the filter characteristics of conventional optical filters change, with features moving to shorter or longer wavelengths. The blocking band for the excitation wavelength starts to transmit more, and the transmission band for the emission wavelength starts to transmit less. The signal-to-background ratio decreases, compromising the quality of the fluorescence image.
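The wavelength shift with incident angle described above is commonly modeled for thin-film interference filters with an effective-index relation, λ(θ) = λ0·sqrt(1 − (sin θ / n_eff)²). The sketch below uses that relation with an assumed effective index and band edge, purely for illustration:

```python
# Sketch of the angular blue shift of a thin-film interference filter using the
# standard effective-index model. n_eff and the band edge are assumed values.
import math

def shifted_wavelength_nm(lambda0_nm: float, theta_deg: float, n_eff: float = 2.0) -> float:
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda0_nm * math.sqrt(1.0 - s * s)

edge_nm = 810.0  # assumed blocking-band edge near an excitation wavelength
for theta in (0, 20, 30):
    print(f"{theta:>2} deg: band edge ~ {shifted_wavelength_nm(edge_nm, theta):.1f} nm")
# At 30 deg the edge moves to roughly 784 nm, i.e. the blocking band shifts
# toward shorter wavelengths and excitation light near 810 nm can leak through.
```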
  • the present disclosure provides an optical filter allowing fluorescence imaging over a wide field of view with compact and inexpensive optics needed for surgical endoscopy.
  • the optical filter as taught herein can include multiple radial zones to receive light beams incident on the zones over broad incident angle ranges (e.g., between about -35 degrees or more and about +35 degrees or more, between about -60 degrees or more and about +60 degrees or more, or the like).
  • examples are described with respect to FIGS. 8A and 8B.
  • the characteristics (e.g., coating, the number of layers, layer thickness, shape, dimensions, materials, and/or the like) of each radial zone can be different, but constant within the same radial zone.
  • transmission and blocking bands move with the incident angle as described above: a blocking band for the excitation wavelength starts to transmit more, and a transmission band for the emission wavelength starts to transmit less. However, the characteristics of each radial zone can be adjusted so that the movement of the transmission and blocking bands remains tolerable for the corresponding wavelength range that each radial zone is designed for, thereby maintaining excitation wavelength blocking characteristics and emission wavelength transmittance characteristics across multiple radial zones over a broad incident angle range (e.g., between about -35 degrees or more and about +35 degrees or more, between about -60 degrees or more and about +60 degrees or more, or the like).
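One way such per-zone adjustment could be realized, assuming the effective-index shift model from the earlier sketch (an illustrative approach, not the specific design disclosed here), is to choose each zone's normal-incidence design wavelength so that the shifted band edge lands at the target wavelength for that zone's typical incident angle:

```python
# Hypothetical per-zone adjustment: tune each zone's normal-incidence design
# wavelength so that, at the zone's typical incident angle, the angularly
# shifted band edge falls at the desired target wavelength. Assumed values.
import math

def design_wavelength_nm(target_nm: float, typical_theta_deg: float,
                         n_eff: float = 2.0) -> float:
    s = math.sin(math.radians(typical_theta_deg)) / n_eff
    return target_nm / math.sqrt(1.0 - s * s)

target_edge_nm = 810.0  # assumed excitation blocking edge
for zone, typical_angle in (("inner", 10), ("middle", 27), ("outer", 40)):
    d = design_wavelength_nm(target_edge_nm, typical_angle)
    print(f"{zone} zone (~{typical_angle} deg): design edge at {d:.1f} nm")
# Each zone's coating is tuned differently, yet every zone presents roughly the
# same blocking edge to the rays it actually receives.
```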
  • controller can refer to a hardware device that includes a memory and a processor and is specifically programmed to execute the processes described herein in accordance with some embodiments.
  • the memory is configured to store the modules and the processor is specifically configured to execute said modules to perform one or more processes which are described further below.
  • multiple different controllers or multiple different types of controllers can be employed in performing one or more processes.
  • different controllers can be implemented in different portions of a surgical robotic system.
  • the surgical robotic module can include multiple different submodules or parts that can be inserted into the trocar separately.
  • the surgical robotic module or robotic assembly can include multiple separate robotic arms that are deployable within the patient along different or separate axes. These multiple separate robotic arms can be collectively referred to as a robotic arm assembly herein.
  • a surgical camera assembly can also be deployed along a separate axis.
  • the surgical robotic module or robotic assembly can also include the surgical camera assembly.
  • the surgical robotic module or robotic assembly employs multiple different components, such as a pair of robotic arms and a surgical or robotic camera assembly, each of which is deployable along a different axis and is separately manipulatable, maneuverable, and movable.
  • the robot support system can be directly mounted to a surgical table or to the floor or ceiling within an operating room. In another embodiment, the mounting is achieved by various fastening means, including but not limited to, clamps, screws, or a combination thereof. In other embodiments, the structure can be free standing.
  • the robot support system can mount a motor assembly that is coupled to the surgical robotic module, which includes the robotic arm assembly and the camera assembly.
  • the motor assembly can include gears, motors, drivetrains, electronics, and the like, for powering the components of the surgical robotic module.
  • the robotic arm assembly and the camera assembly are capable of multiple degrees of freedom of movement.
  • the sensors of the sensing and tracking module 16 can be coupled to a region of the hand and/or the arm, such as the fingers, the wrist region, the elbow region, and/or the shoulder region. Additional sensors can also be coupled to a head and/or neck region of the operator in some embodiments.
  • the sensing and tracking module 16 can be external and coupled to the hand controllers 17 via electrical components and/or mounting hardware.
  • the optional sensor and tracking module 16A can sense and track movement of one or more of an operator’s head (or at least a portion of the head), an operator’s eyes, or an operator’s neck based, at least in part, on imaging of the operator, in addition to or instead of by a sensor or sensors attached to the operator’s body.
  • the trocar 50 is a medical device that can be made up of an awl (which can be a metal or plastic sharpened or non-bladed tip), a cannula (essentially a hollow tube), and a seal in some embodiments.
  • the trocar 50 can be used to place at least a portion of the robotic subsystem 20 in an interior cavity of a subject (e.g., a patient) and can withdraw gas and/or fluid from a body cavity.
  • the robotic subsystem 20 can be inserted through the trocar 50 to access and perform an operation in vivo in a body cavity of a patient.
  • the robotic subsystem 20 can be supported, at least in part, by the trocar 50 or a trocar mount with multiple degrees of freedom such that the robotic arm assembly 42 and the camera assembly 44 can be maneuvered within the patient into a single position or multiple different positions.
  • the robotic arm assembly 42 and camera assembly 44 can be moved with respect to the trocar 50 or a trocar mount with multiple different degrees of freedom such that the robotic arm assembly 42 and the camera assembly 44 can be maneuvered within the patient into a single position or multiple different positions.
  • the RSS 46 can further include an optional controller for processing input data from one or more of the system components (e.g., the display 12, the sensing and tracking module 16, the robotic arm assembly 42, the camera assembly 44, and the like), and for generating control signals in response thereto.
  • the motor 40 can also include a storage element for storing data in some embodiments.
  • the robotic arm assembly 42 can also have associated therewith end regions that can terminate in end-effectors that follow the movement of one or more fingers of the operator in some embodiments, such as for example the index finger as the user pinches together the index finger and thumb.
  • the robotic arm assembly 42 can follow movement of the arms of the operator in some modes of control while a virtual chest of the robotic assembly can remain stationary (e.g., in an instrument control mode).
  • the position and orientation of the torso of the operator are subtracted from the position and orientation of the operator’s arms and/or hands. This subtraction allows the operator to move his or her torso without the robotic arms moving. Further disclosure control of movement of individual arms of a robotic assembly is provided in International Patent Application Publications WO 2022/094000 Al and WO 2021/231402 Al, each of which is incorporated by reference herein in its entirety.
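The torso subtraction described above amounts to re-expressing the hand pose in the torso frame. The following is an illustrative sketch of that computation, not the system's actual implementation:

```python
# Illustrative sketch: subtract the operator's torso pose from a hand pose so
# that torso motion alone does not move the robotic arm. The hand pose is
# re-expressed in the torso frame.
import numpy as np

def pose_in_torso_frame(p_hand, R_hand, p_torso, R_torso):
    """p_* are 3-vectors, R_* are 3x3 rotation matrices in a common world frame."""
    R_rel = R_torso.T @ R_hand                # hand orientation relative to torso
    p_rel = R_torso.T @ (p_hand - p_torso)    # hand position relative to torso
    return p_rel, R_rel

# If the operator translates and turns the torso while holding the hand still
# relative to the torso, the relative pose (and thus the commanded arm pose)
# is unchanged.
p_torso = np.array([0.1, 0.0, 0.0])
yaw = np.radians(15)
R_torso = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                    [np.sin(yaw),  np.cos(yaw), 0.0],
                    [0.0,          0.0,         1.0]])
p_hand = p_torso + R_torso @ np.array([0.3, 0.0, 0.2])
R_hand = R_torso  # hand aligned with torso in this toy example
print(pose_in_torso_frame(p_hand, R_hand, p_torso, R_torso))
```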
  • the camera assembly 44 is configured to provide the operator with image data 48, such as for example a live video feed of an operation or surgical site, as well as enable the operator to actuate and control the cameras forming part of the camera assembly 44.
  • the camera assembly 44 can include one or more cameras (e.g., a pair of cameras), the optical axes of which are axially spaced apart by a selected distance, known as the inter-camera distance, to provide a stereoscopic view or image of the surgical site.
  • the operator can control the movement of the cameras via movement of the hands via sensors coupled to the hands of the operator or via hand controllers 17 grasped or held by hands of the operator, thus enabling the operator to obtain a desired view of an operation site in an intuitive and natural manner.
  • the operator can additionally control the movement of the camera via movement of the operator’s head.
  • the camera assembly 44 is movable in multiple directions, including for example in yaw, pitch and roll directions relative to a direction of view.
  • the components of the stereoscopic cameras can be configured to provide a user experience that feels natural and comfortable.
  • the interaxial distance between the cameras can be modified to adjust the depth of the operation site perceived by the operator.
  • the image or video data 48 generated by the camera assembly 44 can be displayed on the display 12.
  • the display 12 includes an HMD (head-mounted display)
  • the display can include the built-in sensing and tracking module 16A that obtains raw orientation data for the yaw, pitch and roll directions of the HMD as well as positional data in Cartesian space (x, y, z) of the HMD.
  • positional and orientation data regarding an operator’s head can be provided via a separate head-tracking module.
  • the sensing and tracking module 16A can be used to provide supplementary position and orientation tracking data of the display in lieu of or in addition to the built-in tracking system of the HMD.
  • no head tracking of the operator is used or employed.
  • images of the operator can be used by the sensing and tracking module 16A for tracking at least a portion of the operator’s head.
  • FIG. 2A depicts an example robotic assembly 20, which is also referred to herein as a robotic subsystem, of a surgical robotic system 10 incorporated into or mounted onto a mobile patient cart in accordance with some embodiments.
  • the robotic subsystem 20 includes the RSS 46, which, in turn, includes the motor 40, the robotic arm assembly 42 having end-effectors 45, the camera assembly 44 having one or more cameras 47, and can also include the trocar 50 or a trocar mount.
  • FIG. 2B depicts an example of an operator console 11 of the surgical robotic system 10 of the present disclosure in accordance with some embodiments.
  • the operator console 11 includes a display 12, hand controllers 17, and also includes one or more additional controllers, such as a foot pedal array 19 for control of the robotic arm assembly 42, for control of the camera assembly 44, and for control of other aspects of the system.
  • FIG. 2B also depicts the left hand controller subsystem 23A and the right hand controller subsystem 23B of the operator console.
  • the left hand controller subsystem 23A includes and supports the left hand controller 17A
  • the right hand controller subsystem 23B includes and supports the right hand controller 17B.
  • the left hand controller subsystem 23A can releasably connect to or engage the left hand controller 17A
  • the right hand controller subsystem 23B can releasably connect to or engage the right hand controller 17B
  • the connections can be both physical and electronic so that the left hand controller subsystem 23A and the right hand controller subsystem 23B can receive signals from the left hand controller 17A and the right hand controller 17B, respectively, including signals that convey inputs received from a user selection on a button or touch input device of the left hand controller 17A or the right hand controller 17B.
  • Each of the left hand controller subsystem 23A and the right hand controller subsystem 23B can include components that enable a range of motion of the respective left hand controller 17A and right hand controller 17B, so that the left hand controller 17A and right hand controller 17B can be translated or displaced in three dimensions and can additionally move in the roll, pitch, and yaw directions. Additionally, each of the left hand controller subsystem 23A and the right hand controller subsystem 23B can register movement of the respective left hand controller 17A and right hand controller 17B in each of the foregoing directions and can send a signal providing such movement information to the processor 22 (as illustrated in FIG. 1) of the surgical robotic system 10.
  • each of the left hand controller subsystem 23A and the right hand controller subsystem 23B can be configured to receive and connect to or engage different hand controllers (not illustrated).
  • hand controllers with different configurations of buttons and touch input devices can be provided.
  • hand controllers with a different shape can be provided. The hand controllers can be selected for compatibility with a particular surgical robotic system or a particular surgical robotic procedure or selected based upon preference of an operator with respect to the buttons and input devices or with respect to the shape of the hand controller in order to provide greater comfort and ease for the operator.
  • FIG. 3A schematically depicts a side view of the surgical robotic system 10 performing a surgery within an internal cavity 104 of a subject 100 in accordance with some embodiments and for some surgical procedures.
  • FIG. 3B schematically depicts a top view of the surgical robotic system 10 performing the surgery within the internal cavity 104 of the subject 100.
  • the subject 100 (e.g., a patient) can be positioned on an operation table 102 (e.g., a surgical table).
  • an incision is made in the patient 100 to gain access to the internal cavity 104.
  • the trocar 50 is then inserted into the patient 100 at a selected location to provide access to the internal cavity 104 or operation site.
  • the RSS 46 can then be maneuvered into position over the patient 100 and the trocar 50.
  • the RSS 46 includes a trocar mount that attaches to the trocar 50.
  • the camera assembly 44 and the robotic arm assembly 42 can be coupled to the motor 40 and inserted individually and/or sequentially into the patient 100 through the trocar 50 and hence into the internal cavity 104 of the patient 100.
  • references to insertion of the robotic arm assembly 42 and/or the camera assembly 44 into an internal cavity of a subject and disposing the robotic arm assembly 42 and/or the camera assembly 44 in the internal cavity of the subject are referring to the portions of the robotic arm assembly 42 and the camera assembly 44 that are intended to be in the internal cavity of the subject during use.
  • the sequential insertion method has the advantage of supporting smaller trocars, so that smaller incisions can be made in the patient 100, thus reducing the trauma experienced by the patient 100.
  • the camera assembly 44 and the robotic arm assembly 42 can be inserted in any order or in a specific order.
  • the camera assembly 44 can be followed by a first robotic arm 42A of the robotic arm assembly 42 and then followed by a second robotic arm 42B of the robotic arm assembly 42 all of which can be inserted into the trocar 50 and hence into the internal cavity 104.
  • the RSS 46 can move the robotic arm assembly 42 and the camera assembly 44 to an operation site manually or automatically controlled by the operator console 11.
  • FIG. 4A is a perspective view of a robotic arm subassembly 21 in accordance with some embodiments.
  • the robotic arm subassembly 21 includes a robotic arm 42A, the end-effector 45 having an instrument tip 120 (e.g., monopolar scissors, needle driver/holder, bipolar grasper, or any other appropriate tool), and a shaft 122 supporting the robotic arm 42A.
  • a distal end of the shaft 122 is coupled to the robotic arm 42A, and a proximal end of the shaft 122 is coupled to a housing 124 of the motor 40 (as illustrated in FIG. 2A). At least a portion of the shaft 122 can be external to the internal cavity 104 (as illustrated in FIGS. 3A and 3B). At least a portion of the shaft 122 can be inserted into the internal cavity 104 (as illustrated in FIGS. 3A and 3B).
  • FIG. 4B is a side view of the robotic arm assembly 42.
  • the robotic arm assembly 42 includes a shoulder joint 126 forming a virtual shoulder, an elbow joint 128 having position sensors 132 (e.g., capacitive proximity sensors) and forming a virtual elbow, a wrist joint 130 forming a virtual wrist, and the end-effector 45 in accordance with some embodiments.
  • the shoulder joint 126, the elbow joint 128, and the wrist joint 130 can include a series of hinge and rotary joints to provide each arm with seven positionable degrees of freedom, along with one additional grasping degree of freedom for the end-effector 45 in some embodiments.
  • the surgical robotic system 10 as a whole has nine degrees of freedom.
  • FIG. 5 illustrates a perspective front view of a portion of the robotic assembly 20 configured for insertion into an internal body cavity of a patient.
  • the robotic assembly 20 includes a robotic arm 42A and a robotic arm 42B.
  • the two robotic arms 42A and 42B can define, or at least partially define, a virtual chest 140 of the robotic assembly 20 in some embodiments.
  • the virtual chest 140 (depicted as a triangle with dotted lines) can be defined by a chest plane extending between a first pivot point 142A of a most proximal joint of the robotic arm 42A (e.g., a shoulder joint 126), a second pivot point 142B of a most proximal joint of the robotic arm 42B, and a camera imaging center point 144 of the camera(s) 47.
  • a pivot center 146 of the virtual chest 140 lies in the middle of the virtual chest 140.
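As an illustrative sketch (with made-up coordinates), the chest plane and a pivot center can be computed from the two shoulder pivot points and the camera imaging center point:

```python
# Sketch: characterize the virtual chest plane by the plane through the two
# shoulder pivot points and the camera imaging center point; the centroid of
# the three points gives one way to place a pivot center in the middle of the
# virtual chest. Coordinates below are made up for illustration.
import numpy as np

def chest_plane(p_shoulder_a, p_shoulder_b, p_camera_center):
    pts = np.array([p_shoulder_a, p_shoulder_b, p_camera_center], dtype=float)
    centroid = pts.mean(axis=0)                      # pivot center of the chest
    normal = np.cross(pts[1] - pts[0], pts[2] - pts[0])
    normal /= np.linalg.norm(normal)                 # unit normal of the chest plane
    return centroid, normal

pivot_center, normal = chest_plane([-0.03, 0.0, 0.0],   # pivot point 142A
                                   [ 0.03, 0.0, 0.0],   # pivot point 142B
                                   [ 0.00, 0.02, 0.01]) # camera center 144
print(pivot_center, normal)
```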
  • sensors in one or both of the robotic arm 42A and the robotic arm 42B can be used by the surgical robotic system 10 to determine a change in location in three-dimensional space of at least a portion of each or both of the robotic arms 42A and 42B.
  • sensors in one or both of the first robotic arm 42A and second robotic arm 42B can be used by the surgical robotic system 10 to determine a location in three-dimensional space of at least a portion of one robotic arm relative to a location in three-dimensional space of at least a portion of the other robotic arm.
  • the camera assembly 44 is configured to obtain images from which the surgical robotic system 10 can determine relative locations in three-dimensional space.
  • the camera assembly 44 can include multiple cameras, at least two of which are laterally displaced from each other relative to an imaging axis, and the system can be configured to determine a distance to features within the internal body cavity.
  • further details of a surgical robotic system including a camera assembly and an associated system for determining a distance to features can be found in International Patent Application Publication No. WO 2021/159409, entitled “System and Method for Determining Depth Perception In Vivo in a Surgical Robotic System,” published August 12, 2021, which is incorporated by reference herein in its entirety.
  • Information about the distance to features and information regarding optical properties of the cameras can be used by a system to determine relative locations in three-dimensional space.
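For a rectified pair of laterally displaced cameras, the classical disparity relation gives one way such distances can be obtained; the method in the referenced publication may differ, and the numbers below are assumptions:

```python
# Sketch of the classical depth-from-disparity relation for two laterally
# displaced cameras with parallel optical axes. Values are illustrative.
def depth_mm(focal_length_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Z = f * B / d for a rectified stereo pair."""
    return focal_length_px * baseline_mm / disparity_px

# e.g., 600 px focal length, 5 mm inter-camera baseline, 30 px disparity:
print(depth_mm(600.0, 5.0, 30.0))  # -> 100.0 mm to the feature
```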
  • FIG. 6 illustrates transmission spectra 610 and 620 of a conventional optical filter with spatially uniform characteristics for incident angles of 0 and 30 degrees, respectively.
  • Line 630 depicts ICG excitation and line 640 depicts ICG emission.
  • Line 650 depicts fluorescein excitation and line 660 depicts fluorescein emission.
  • as the incident angle increases, the blocking band (e.g., a band having about 0% transmittance) for the excitation wavelength starts to transmit more, and the transmission band (e.g., a band having about 60% or more transmittance) for the emission wavelength starts to transmit less.
  • a signal-to-background ratio decreases, compromising the quality of the fluorescence image.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Toxicology (AREA)
  • General Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Endoscopes (AREA)

Abstract

An optical filter (800A) for a wide-angle field-of-view fluorescent scene imager is disclosed. For example, an optical filter (800A) can include a first radial zone (810A) to receive a first plurality of light beams having a first incident angle range. The first radial zone (810A) can include one or more blocking wavelength bands and one or more transmission wavelength bands. The optical filter (800A) can further include a second radial zone (810B) configured to receive a second plurality of light beams having a second incident angle range greater than the first incident angle range. The second radial zone (810B) can maintain one or more excitation wavelength blocking characteristics within the one or more blocking wavelength bands, and one or more emission wavelength transmittance characteristics within the one or more transmission wavelength bands, across the first and second radial zones.
PCT/US2024/048930 2023-09-29 2024-09-27 Fluorescent scene imager with wide-angle field of view Pending WO2025072724A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363541660P 2023-09-29 2023-09-29
US63/541,660 2023-09-29

Publications (1)

Publication Number Publication Date
WO2025072724A1 (fr) 2025-04-03

Family

ID=93212008

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/048930 Pending WO2025072724A1 (fr) Fluorescent scene imager with wide-angle field of view

Country Status (1)

Country Link
WO (1) WO2025072724A1 (fr)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3609854B2 (ja) * 1994-08-12 2005-01-12 ペンタックス株式会社 Illumination device having optical fibers
JPH08224210 (ja) * 1995-02-23 1996-09-03 Olympus Optical Co Ltd Fluorescence observation device
US20040186351A1 (en) * 1996-11-20 2004-09-23 Olympus Optical Co., Ltd. (Now Olympus Corporation) Fluorescent endoscope system enabling simultaneous achievement of normal light observation based on reflected light and fluorescence observation based on light with wavelengths in infrared spectrum
EP1720050A1 (fr) * 2004-02-09 2006-11-08 Tamron Co., Ltd. Imaging optical system with chromatic aberration correction
JP2006285214A (ja) * 2005-04-04 2006-10-19 Ctx Opto Electronics Corp Filtering device and optical lens device
US20210239891A1 (en) * 2014-03-04 2021-08-05 Stryker European Operations Limited Spatial and spectral filtering apertures and optical imaging systems including the same
US10285765B2 (en) 2014-05-05 2019-05-14 Vicarious Surgical Inc. Virtual reality surgical device
US20180221102A1 (en) 2017-02-09 2018-08-09 Vicarious Surgical Inc. Virtual reality surgical tools system
US20190076199A1 (en) 2017-09-14 2019-03-14 Vicarious Surgical Inc. Virtual reality surgical camera system
WO2021159409A1 (fr) 2020-02-13 2021-08-19 Oppo广东移动通信有限公司 Power control method and apparatus, and terminal
WO2021231402A1 (fr) 2020-05-11 2021-11-18 Vicarious Surgical Inc. System and method for reversing orientation and view of selected components of a miniaturized surgical robotic unit in vivo
WO2022094000A1 (fr) 2020-10-28 2022-05-05 Vicarious Surgical Inc. Laparoscopic surgical robotic system with internal degrees of freedom of articulation

Similar Documents

Publication Publication Date Title
US20230255446A1 (en) Surgical visualization systems and displays
US20240382174A1 (en) Surgical visualization systems
US11147443B2 (en) Surgical visualization systems and displays
US20230122367A1 (en) Surgical visualization systems and displays
US12219228B2 (en) Stereoscopic visualization camera and integrated robotics platform
US20220054223A1 (en) Surgical visualization systems and displays
US10028651B2 (en) Surgical visualization systems and displays
US11154378B2 (en) Surgical visualization systems and displays
US12349860B2 (en) Medical observation system, control device, and control method
JP2023544360A (ja) 複数の外科用ディスプレイ上へのインタラクティブ情報オーバーレイ
JP2023544594A (ja) 容量及びユーザ操作に基づく階層化されたシステムの表示制御
JP2023544593A (ja) 協働的外科用ディスプレイ
WO2025072724A1 (fr) Fluorescent scene imager with wide-angle field of view
US20250344942A1 (en) Multispectral imaging camera and methods of use
WO2021049220A1 (fr) Medical support arm and medical system
CN221888377U (zh) Lens assembly and endoscope camera system
CN117643511A (zh) Light source host and endoscope camera system

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 24793924

Country of ref document: EP

Kind code of ref document: A1