WO2025160560A1 - Compact image-guided surgical system - Google Patents
Compact image-guided surgical system
- Publication number
- WO2025160560A1 (PCT/US2025/013227)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- lens
- optical system
- beam splitter
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/62—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
- G01N21/63—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
- G01N21/64—Fluorescence; Phosphorescence
- G01N21/645—Specially adapted constructive features of fluorimeters
- G01N21/6456—Spatial resolved fluorescence measurements; Imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
- G03B15/03—Combinations of cameras with lighting apparatus; Flash units
- G03B15/05—Combinations of cameras with electronic flash apparatus; Electronic flash units
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/14—Special procedures for taking photographs; Apparatus therefor for taking photographs during medical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
- A61B2090/306—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using optical fibres
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
- A61B2090/309—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using white LEDs
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/373—Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/62—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
- G01N21/63—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
- G01N21/64—Fluorescence; Phosphorescence
- G01N21/6428—Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes"
- G01N2021/6439—Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes" with indicators, stains, dyes, tags, labels, marks
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2201/00—Features of devices classified in G01N21/00
- G01N2201/02—Mechanical
- G01N2201/022—Casings
- G01N2201/0221—Portable; cableless; compact; hand-held
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N2201/00—Features of devices classified in G01N21/00
- G01N2201/06—Illumination; Optics
- G01N2201/063—Illuminating optical parts
- G01N2201/0635—Structured illumination, e.g. with grating
Definitions
- This patent document is generally related to visible light and infrared imaging and pattern generation techniques.
- Near-infrared (NIR) fluorescence from a target is of interest in the medical field. Techniques that enable such visualization can assist during surgical procedures and lead to improvements in the way these procedures are carried out.
- the disclosed embodiments relate to devices, systems and associated methods that, among other features and benefits, produce images obtained via NIR fluorescence and project patterns associated with the images onto a sample, such as a biological tissue, using visible light.
- a surgeon can maintain a fixed gaze on the surgical field rather than splitting his/her attention between the patient and an external display.
- the disclosed devices can be implemented as a handheld device or a head-mount device which allows a person to place the device on his/her head.
- FIG. 1 shows a diagram of an example device based on the disclosed technology.
- FIGS. 2A-C show example microLED light sources that can be implemented in accordance with embodiments of the disclosed technology, and example features thereof.
- FIGS. 3A-C show an image and schematic representations of an example handheld device based on the disclosed technology.
- FIGS. 4A-D show example fluorescence images and patterns projected onto an object using visible light in accordance with implementations of the disclosed technology.
- FIG. 5 shows an example layout of an optical system based on the disclosed technology.
- FIG. 6 shows a schematic of an example optical system based on the disclosed technology that can be implemented as part of a head-mounted device.
- FIG. 7 shows schematics of example lens systems that can be implemented in embodiments based on the disclosed technology.
- FIG. 8 shows a flow diagram of an example method based on the disclosed technology.
- FIG. 1 illustrates a diagram of an example device based on the disclosed technology.
- a NIR illumination laser source with an operating wavelength of 785 nm is used to illuminate the sample that has been treated or otherwise tagged with a dye such as Indocyanine green (ICG).
- the fluorescence light produced by the sample is captured by the lens and after optional filtering is imaged at the camera.
- the captured image is processed, and the useful information is transferred to the image source, such as a microLED panel, an organic light-emitting diode (OLED), liquid crystal on silicon (LCoS) device, digital micromirror device (DMD), or laser scanner.
- a pattern associated with the captured image is projected onto the sample by the microLED display using visible light, such as green light. Without the visible projection light, the section of the sample of interest would not be visible to the human eye (the fluorescence is in the IR range). But using the disclosed visible-light projection technique, the imaged pattern is projected back onto the sample and becomes visible.
- the microLED in this example, operating at a wavelength of 520 nm
- the microLED is compact in size and allows implementation of the disclosed technology in a compact form, such as in a head mounted device.
- the operations associated with imaging and projection of FIG. 1 can be repeated at the video frame rate to enable continuous capture and projection of images at, for example, 30 frames per second.
- acquiring the image and projecting the pattern can be performed alternately and repeatedly at a particular rate. This can be advantageous if the features imaged in the infrared change over time (e.g., a blood clot that moves).
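The alternating acquire-then-project cycle described above can be sketched in a few lines of Python. The `camera`, `panel`, and `extract_pattern` objects below are hypothetical stand-ins (the patent does not specify a software interface); the 30 Hz rate is the example rate given in the text.

```python
import time

FRAME_RATE_HZ = 30            # example video rate mentioned in the text
PERIOD_S = 1.0 / FRAME_RATE_HZ

def run_capture_project_loop(camera, panel, extract_pattern, stop_event):
    """Alternate NIR image acquisition and visible-light projection.

    Assumed, illustrative interfaces:
      - camera.capture()        -> 2-D fluorescence frame
      - panel.blank()/show(p)   -> drives the microLED pixels
      - extract_pattern(frame)  -> pattern derived from the frame
    """
    while not stop_event.is_set():
        t0 = time.monotonic()
        panel.blank()                        # avoid projected light bleeding into the NIR frame
        frame = camera.capture()             # acquire fluorescence image
        panel.show(extract_pattern(frame))   # project the associated pattern
        # pace the loop at the target frame rate
        time.sleep(max(0.0, PERIOD_S - (time.monotonic() - t0)))
```

Blanking the panel before each capture is one simple way to realize the alternation; an actual device might instead rely on spectral filtering to separate the projected visible light from the NIR signal.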
- FIGS. 2A-C illustrate examples of commercially available microLED light sources, and some of their features and benefits.
- the microLED light sources may be compact, as shown in FIG. 2A using a reference pencil for scale.
- the microLED light sources may be formed as an array.
- FIG. 2B shows an example of the array which may have dimensions of 1.92 mm x 2.56 mm or smaller.
- FIG. 2C shows an example of a pattern, having dimensions of 9.6 mm x 7.2 mm, which can be projected using visible light generated by microLED light sources.
- FIGS. 3A-C illustrate an example handheld device based on the disclosed technology, including some of its features, components and method of operation as a handheld device.
- the handheld device includes a housing to enclose various elements of an optical system or device based on the disclosed technology, such as the example device previously described with reference to FIG. 1.
- the housing may be 3D printed.
- the overall dimensions of the handheld device are 14.1 cm x 9.0 cm x 7.0 cm.
- FIG. 3A provides an example of how microLED light sources, a camera, and additional optical elements may be disposed within the housing.
- the additional optical elements may include, but are not limited to, filters, light sources, beam splitters, and lenses configured in various arrangements.
- the handheld device includes optical paths for illumination, imaging, and projection.
- the housing may include one or more openings to allow light to propagate into or out of the handheld device. For example, input light may be received at an input of the device via a cable (far right in FIG. 3A).
- FIG. 3B shows a cross section (viewed from top) of the example handheld device, which includes an input for receiving incident light from a light source, such as fiber for delivery of a laser light, and a light pipe configured to direct the laser light towards optics disposed within the housing and improve illumination uniformity.
- the incident laser light includes infrared light.
- the infrared light exits the handheld device through an opening (top opening on the left side of FIG. 3B) to enable illumination of an object with the infrared light.
- a user may operate the handheld device by aiming the device at a location on an object and activating the device to illuminate the object. This method of operation is illustrated in FIG. 3C.
- the housing also includes another opening (lower opening on the left side of FIG. 3B) used for imaging the object.
- the handheld device includes a processing unit in communication with the camera and the microLED sources.
- the processing unit is capable of extracting information from the image and transmitting the information to the microLED sources.
- the processing unit is configured to cause some or all of the microLED sources to illuminate based on the extracted information from the image.
- a pattern associated with the captured image is projected onto the object by the microLED sources using visible light.
- the microLED sources can output visible light in various colors. As shown in FIG. 3C, the visible light used for projection may exit the handheld device from the same opening which is used for imaging the object.
- Table I below provides example parameters, and their associated values, of the example handheld device shown in FIGS. 3A-C.
- FIG. 4A shows an example of a fluorescence image obtained from a stained area of an object in accordance with implementations of the disclosed technology.
- FIG. 4B shows a threshold image obtained based on the first image.
- the disclosed embodiments can be used to project the image onto a target (in this case the sample was a swine liver).
- Various projection patterns are possible, as demonstrated by the checkerboard pattern shown in FIG. 4C. The results indicate that the microLED display is sufficiently bright and can project an image that can be readily seen by an observer.
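The FIG. 4A to FIG. 4B step, deriving a threshold image from the raw fluorescence image, can be sketched as follows. The relative-threshold criterion (a fraction of the frame's peak intensity) is an assumption for illustration; the patent only states that a threshold image is obtained from the fluorescence image.

```python
import numpy as np

def fluorescence_to_pattern(frame: np.ndarray, rel_threshold: float = 0.5) -> np.ndarray:
    """Convert a raw fluorescence frame into a binary projection pattern.

    `rel_threshold` is an assumed, illustrative criterion: pixels at or above
    this fraction of the normalized intensity range are marked for projection.
    """
    frame = frame.astype(np.float64)
    lo, hi = frame.min(), frame.max()
    if hi <= lo:                        # flat frame: nothing to project
        return np.zeros(frame.shape, dtype=bool)
    norm = (frame - lo) / (hi - lo)     # normalize to [0, 1]
    return norm >= rel_threshold        # True pixels would drive the microLED elements
```

For example, `fluorescence_to_pattern(np.array([[0, 10], [200, 255]]))` marks only the two bright pixels, which is the kind of binary mask that could then be handed to the microLED panel for projection.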
- FIG. 5 illustrates an example optical system layout for a disclosed device.
- FIG. 5 is explained in the context of the device shown in FIG. 1.
- the optical axis of the illumination path is at an angle with respect to the optical axis of the imaging and display system. Accordingly, the target area of the sample may not be fully illuminated unless the illumination source and/or the sample are positioned at an appropriate distance from each other. This working-distance dependence can be problematic for the device.
- the configuration of FIG. 5 overcomes this problem by aligning the optical axis of the illumination path with that of the imaging and display system.
- a beam splitter (which can be a dichroic mirror) is used to combine optical axes of the illumination and imaging/display system. With this configuration, illumination field and imaging/display field always overlap, irrespective of the working distance.
- the illumination path includes a beam shaping lens and a folding mirror.
- the beam shaping lens receives incident light from a light source (e.g., an optical fiber) and directs the incident light towards the folding mirror.
- the folding mirror is configured to reflect the light incident thereupon towards a first beam splitter (Beam splitter 1) which directs the light from the folding mirror towards an object for illumination.
- the first beam splitter is also shared with the imaging path and the display path.
- the imaging path in FIG. 5 includes an imaging sensor (e.g., CMOS sensor), an optical filter, two lenses (Lens 0 and Lens 2), and a second beam splitter (Beam splitter 2) positioned between the two lenses.
- Fluorescence signals emitted by the object upon illumination are received at the imaging sensor after propagating through the first beam splitter, the second beam splitter, the optical filter, and the two lenses.
- the second beam splitter is configured to direct incident light from the illumination, projection, and display paths along the optical axes shown in FIG. 5. While FIG. 5 shows the optical filter positioned in front of the imaging sensor, other locations for the optical filter are possible.
- the display path in FIG. 5 includes a microLED panel, two lenses (Lens 0 and Lens 1), the second beam splitter, and the first beam splitter.
- Visible light representing an image obtained by the imaging sensor is output by the microLED panel, propagates through the Lens 1 and towards the second beam splitter, which reflects the visible light towards Lens 0 along the imaging path.
- the visible light is transmitted through the first beam splitter toward the object.
- the visible light may take the form of a pattern (e.g., the letter “A” shown in FIG. 5) which is projected onto the object and associated with the image captured by the imaging sensor.
- One issue in FIG. 5 relates to the active sizes of the microLED panel and the imaging sensor, which are typically different. If the same optical system were used for both the display path with the microLED panel and the imaging path with the imaging sensor, the imaging field (i.e., the field captured by the imaging sensor) would differ from the field displayed by the microLED panel. In order to fully utilize both microLED pixels and CMOS pixels, the configuration in FIG. 5 includes two different optical systems: one that includes lens 0 and lens 2 for the imaging path, and another that includes lens 0 and lens 1 for the display path, so that the imaging field and display field in the surgical area are the same (for example, area A in the figure).
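The field-matching condition can be stated as a simple magnification relation. The symbols below (common field $A$, sensor active size $s_{\mathrm{CMOS}}$, panel active size $s_{\mathrm{LED}}$) are assumptions introduced for illustration, not notation from the patent:

```latex
% Imaging path (lens 0 + lens 2) maps the field A onto the sensor;
% display path (lens 0 + lens 1) maps the microLED panel onto the same field A.
\[
  m_{02} = \frac{s_{\mathrm{CMOS}}}{A},
  \qquad
  m_{01} = \frac{s_{\mathrm{LED}}}{A}
  \quad\Longrightarrow\quad
  \frac{m_{01}}{m_{02}} = \frac{s_{\mathrm{LED}}}{s_{\mathrm{CMOS}}}.
\]
```

In words: lens 1 is chosen so that the display-path magnification differs from the imaging-path magnification by exactly the ratio of the panel size to the sensor size, which makes both paths cover the same area A despite the different active sizes.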
- the second beam splitter in FIG. 5 (which can be a dichroic mirror) is used to combine the imaging path and display path.
- this beam splitter reflects visible light and transmits NIR light.
- the beam shaping lens is used to generate uniform illumination in the surgical field to excite the fluorescence. Depending on the contrast agent used, the excitation light will have a different wavelength.
- Some of the advantages associated with the configuration of FIG. 5 include: compact dimensions, suitable for head mount application, all three fields (illumination, imaging and display) always overlap, and the imaging and display channels have the same field of view.
- FIG. 6 illustrates an example of a configuration that is compact enough to allow implementation as a head-mounted device.
- the panels in FIG. 6 illustrate example dimensions of 3 cm x 8 cm x 6 cm, with optical components that correspond to those shown in FIG. 5.
- One or more cables may connect to the configuration.
- one cable can be used to connect the configuration to a light source. Additional cables may enable signal transmission between the device and data acquisition or processing circuitry.
- FIG. 7 illustrates example lens systems that can be used in implementations of the disclosed technology such as the configuration of FIG. 6 or the optical system of FIG. 5.
- the example lens system (shown at the top of FIG. 7) may be implemented along the imaging path in FIG. 5.
- the location of the filter in FIG. 7 may be different from that which is shown therein.
- Another example lens system, which can be implemented along the display path of FIG. 5, is shown in the bottom of FIG. 7.
- the working distance of the lens system is adjustable, for example, by adjusting the distance between lens 0 and the beam splitter from 1 mm to 1.245 mm.
- FIG. 8 shows a flow diagram illustrating an example method 8000 for image acquisition and projection.
- the method 8000 comprises illuminating an object with infrared light to cause the object to emit fluorescence light.
- the method 8000 comprises acquiring an image by capturing the fluorescence light using an imaging sensor.
- the method 8000 comprises activating elements of a pixelated light source based on information extracted from the image.
- the method 8000 comprises projecting, towards the object and using visible light produced by the activated elements, a pattern associated with the image.
- Embodiments of the disclosed technology support inter alia the following technical solutions that solve the technical problem of producing images obtained via NIR fluorescence and projecting patterns associated with the images onto a sample.
- An optical system can include some or all of the following: a light source and its beam-shaping element to provide uniform illumination; an imaging sensor with an optical filter in front of it to capture fluorescence light; a display unit with an image source; one or more beam splitters to combine the above three systems onto a common optical axis; two optical systems, one for imaging fluorescence light and the other for projecting visible light to the field, where the two optical systems share the optical components in front of the beam splitter; and a processor unit for processing the captured fluorescence image and transferring the information to the image source.
- the image projection source is one of a microLED, OLED, LCoS, digital micromirror device (DMD), or laser scanner.
- An optical system for image acquisition and projection comprising: an illumination path including a first lens configured to receive infrared light and to direct the infrared light towards a sample; an imaging path including: a first beam splitter, a second lens configured to receive fluorescence light produced by the sample in response to illumination by the infrared light and to direct the fluorescence light towards the first beam splitter, and an imaging sensor configured to capture the fluorescence light after passing through the first beam splitter; and a projection path including a pixelated light source configured to project an image towards the sample using visible light, the image associated with the fluorescence light captured by the imaging sensor, the visible light passing through both the first beam splitter and the second lens.
- the first lens of technical solution 4 can correspond to the beam shaping lens shown in FIG. 5
- the first beam splitter of technical solution 4 can correspond to beam splitter 2 shown in FIG. 5.
- the second lens of technical solution 4 can correspond to lens 0 shown in FIG. 5.
- the pixelated light source of technical solution 4 corresponds to the microLED panel shown in FIG. 5.
- the illumination path includes an infrared light source operable to generate the infrared light
- the first lens is positioned to receive the infrared light from the infrared light source and to provide uniform illumination of the sample by the infrared light.
- the projection path includes a third lens positioned between the first beam splitter and the pixelated light source
- the imaging path includes a fourth lens positioned between the first beam splitter and the imaging sensor
- the first beam splitter is positioned to split a portion of the imaging path and projection path that otherwise form a common path.
- the third lens of technical solution 7 can correspond to lens 1 shown in FIG. 5
- the fourth lens of technical solution 7 can correspond to lens 2 shown in FIG. 5.
- magnification provided by a combination of the second lens and the third lens is selected to provide a size of the image projected on the sample to substantially match a size of an image produced by the imaging sensor based on the captured fluorescence light.
- the optical system of technical solution 1 comprising a filter positioned in the imaging path having a bandpass that allows the fluorescence light produced by the sample to pass therethrough in the direction of the imaging sensor.
- the filter of technical solution 11 corresponds to the optical filter shown in FIG. 5.
- the optical system of technical solution 4 further comprising a processing unit in communication with the pixelated light source and the imaging sensor, wherein the processing unit is configured to control illumination elements of the pixelated light source to cause the image to be projected towards the sample based on a pattern associated with the fluorescence light captured by the imaging sensor.
- the imaging sensor is configured to produce a fluorescence image based on the captured fluorescence light
- the processing unit is configured to generate a threshold image based on the fluorescence image and to provide information extracted from the threshold image to the pixelated light source
- the pixelated light source is configured to project the pattern based on the information.
- the optical system of technical solution 13 wherein the illumination elements are configured to produce visible light of various colors.
- the optical system of technical solution 4 including a folding mirror in the illumination path, and a second beam splitter that is common between the illumination, imaging and projection paths, wherein the folding mirror is positioned to receive the infrared light from the first lens and direct the infrared light towards the second beam splitter for illuminating the sample.
- the second beam splitter of technical solution 16 corresponds to beam splitter 1 shown in FIG. 5.
- the pixelated light source is one of a microLED, OLED, LCoS, digital micromirror device (DMD), or laser scanner.
- the illumination path is used to produce an illumination field for illuminating the sample
- the projection path is used to produce a projection field for projecting visible light towards the sample
- the first beam splitter is configured to combine optical axes of the illumination path, the imaging path, and the projection path such that the illumination field and the projection field overlap.
- a method for image acquisition and projection comprising: illuminating an object with infrared light to cause the object to emit fluorescence light; acquiring an image by capturing the fluorescence light using an imaging sensor; activating elements of a pixelated light source based on information extracted from the image; and projecting, towards the object and using visible light produced by the activated elements, a pattern associated with the image.
- Various components may be controlled, or various operations performed, via implementations using a processor/controller that is configured to include, or be coupled to, a memory that stores processor-executable code that causes the processor/controller to carry out various computations and processing of information.
- the processor/controller can further generate and transmit/receive suitable information to/from the various system components, as well as suitable input/output (IO) capabilities (e.g., wired or wireless) to transmit and receive commands and/or data.
- the processor/controller may, for example, provide signals to control the operation of various components such as light sources and detectors that are disclosed herein.
- Various information and data processing operations described herein may be implemented in one embodiment by a computer program product, embodied in a computer-readable medium, including computer-executable instructions, such as program code, executed by computers in networked environments.
- a computer-readable medium may include removable and non-removable storage devices including, but not limited to, Read Only Memory (ROM), Random Access Memory (RAM), compact discs (CDs), digital versatile discs (DVDs), cloud storage, etc. Therefore, the computer-readable media described in the present application comprise non-transitory storage media.
- program modules may include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
- Computer-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps or processes.
Landscapes
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Pathology (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- Immunology (AREA)
- Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
Abstract
Example embodiments of the present invention include an optical system for image acquisition and projection, comprising: an illumination path including a first lens configured to receive infrared light and to direct the infrared light towards a sample; an imaging path including: a first beam splitter, a second lens configured to receive fluorescence light produced by the sample in response to illumination by the infrared light and to direct the fluorescence light towards the first beam splitter, and an imaging sensor configured to capture the fluorescence light after it passes through the first beam splitter; and a projection path including a pixelated light source configured to project an image towards the sample using visible light, the image being associated with the fluorescence light captured by the imaging sensor, the visible light passing through both the first beam splitter and the second lens.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463625694P | 2024-01-26 | 2024-01-26 | |
| US63/625,694 | 2024-01-26 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025160560A1 (fr) | 2025-07-31 |
Family
ID=96545901
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2025/013227 Pending WO2025160560A1 (fr) | 2024-01-26 | 2025-01-27 | Système chirurgical compact guidé par image |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025160560A1 (fr) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180024341A1 (en) * | 2015-02-09 | 2018-01-25 | Arizona Board Of Regents On Behalf Of The University Of Arizona | Augmented stereoscopic microscopy |
| US20200288094A1 (en) * | 2017-11-27 | 2020-09-10 | Panasonic Corporation | Projection device |
| WO2024006808A2 (fr) * | 2022-07-01 | 2024-01-04 | The Regents Of The University Of Michigan | Systèmes et procédés d'imagerie basée sur la fluorescence |
- 2025-01-27: WO PCT/US2025/013227 patent/WO2025160560A1 (fr), active Pending
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25745659; Country of ref document: EP; Kind code of ref document: A1 |