WO2025006759A2 - Endoscopic camera interference mitigation - Google Patents
Endoscopic camera interference mitigation
- Publication number
- WO2025006759A2 (PCT/US2024/035835)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- lasing
- console
- endoscope
- laser
- Prior art date
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A61B1/00 (A: Human necessities; A61B: Diagnosis; Surgery; Identification) — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
  - A61B1/00006 — Operational features of endoscopes characterised by electronic signal processing of control signals
  - A61B1/00009 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
  - A61B1/045 — Control of endoscopes combined with photographic or television appliances
  - A61B1/0655 — Control of illuminating arrangements for endoscopes
- H04N23/00 (H: Electricity; H04N: Pictorial communication, e.g. television) — Cameras or camera modules comprising electronic image sensors; control thereof
  - H04N23/555 — Constructional details for picking-up images in sites inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
  - H04N23/56 — Cameras or camera modules provided with illuminating means
Definitions
- the present disclosure generally relates to endoscopes and particularly, but not exclusively, to mitigating interference with images captured by a camera of an endoscope when used in conjunction with a surgical laser system.
- Endoscopes are used to obtain an internal view of a patient.
- endoscopes have one or more cameras.
- the purpose of the camera is to provide an image of the target (e.g., organ, stone, or the like) on which the medical procedure is being performed or an image in the vicinity of where the medical procedure is to be performed.
- an endoscope can be used to view the interior of a kidney and/or assess kidney stones in the kidney.
- an endoscope can be used to view a ureter possibly having stones, tumors, or strictures.
- an endoscope can be used to view a bladder and its anatomy such as the ureter openings and possible treatment targets like stones or tumors.
- interference with image acquisition can cause image quality degradation and/or creation of image artifacts. In severe cases, the required tissue features and characteristics can become difficult to recognize, to the point that the safety of continued treatment can be impacted. This may force a physician to stop or pause a treatment procedure until the image (or conditions affecting the image) improves.
- the present disclosure provides an endoscopic laser lithotripsy device and methods for such a device where image acquisition and/or processing is configured to mitigate interference from the laser.
- Some embodiments of the disclosure can be implemented as a method for an endoscope camera controller.
- the method can comprise receiving, at a processor of an endoscope console, an indication of a pulse start control signal and one or more laser pulse characteristics, the pulse start control signal to define an initiation of lasing by the lasing console and the one or more laser pulse characteristics to define the lasing; determining, by the processor, an integration timing period of a camera sensor of an endoscope based on the pulse start control signal and the one or more laser pulse characteristics; and generating, by the processor, an integration control signal, the integration control signal to cause the camera sensor to accumulate charge during the integration timing period.
- the method can comprise sending the integration control signal to shutter gating circuitry of the endoscope, the shutter gating circuitry to control a shutter of the endoscope camera.
- the one or more laser pulse characteristics define a frequency of pulses to be generated by the lasing console.
- the one or more laser pulse characteristics define a pulse width of the pulses.
- determining the integration timing period of the camera sensor comprises determining a maximum duration of integration of the camera sensor based on a frame rate of the camera; and determining the integration timing period as the difference between the maximum duration of integration of the camera and the pulse width.
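- As a rough illustration of the computation above, the following sketch derives the integration window from the camera frame rate and the laser pulse width. It is a minimal sketch with hypothetical names and example values, not code from the patent.

```python
def integration_timing_period(frame_rate_hz: float, pulse_width_s: float) -> float:
    """Integration window (seconds) remaining after excluding the laser pulse."""
    max_integration_s = 1.0 / frame_rate_hz  # maximum duration set by the frame period
    return max_integration_s - pulse_width_s

# Example: a 30 Hz camera and a 1 ms pulse leave roughly a 32.3 ms window.
print(integration_timing_period(30.0, 0.001))  # ~0.0323
```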
- the camera is a global shutter style camera or a rolling shutter style camera.
- the lasing console comprises a Thulium fiber laser source or a Holmium laser source.
- Some embodiments of the disclosure can be implemented as a method for a laser console.
- the method can comprise receiving, at a processor of a laser console, an indication of settings of an endoscope camera; determining, by the processor, laser pulse characteristics and a pulse start time for a laser source of the laser console based on the settings; and causing the laser source to generate pulses of light based on the laser pulse characteristics and the pulse start time.
- the settings comprise an indication of a frame rate and a vertical blanking period of the endoscope camera.
- determining the pulse start time is based on the vertical blanking period.
- determining the laser pulse characteristics comprises determining a frequency of the laser source based on a factor of the frame rate.
- Some embodiments of the disclosure can be implemented as a method for an endoscope console.
- the method can comprise receiving, at a processor of an endoscope console, an indication of settings of an endoscope camera; receiving, at the processor, an indication of a pulse start time and one or more laser pulse characteristics, the pulse start time to define an initiation of lasing by the lasing console and the one or more laser pulse characteristics to define the lasing; receiving, by the processor, a plurality of image frames from a camera of an endoscope coupled to the endoscope console; identifying, by the processor, at least a portion of a one of the plurality of image frames where lasing by the lasing console overlaps with integration by the camera based on the one or more laser pulse characteristics, the pulse start time, and the settings; identifying, by the processor, at least a portion of another one of the plurality of image frames where lasing by the lasing console does not overlap with integration by the camera based on the one or more laser pulse characteristics, the pulse start time, and the settings; and replacing at least the portion of the one of the plurality of image frames with at least the portion of the other one of the plurality of image frames.
- the settings comprise an indication of a frame rate and a vertical blanking period of the endoscope camera.
- the one or more laser pulse characteristics define a frequency of pulses to be generated by the lasing console.
- the one or more laser pulse characteristics define a pulse width of the pulses.
- the controller can comprise a processor and memory coupled to the processor, the memory comprising instructions, which when executed by the processor cause the endoscope camera controller to receive an indication of a pulse start control signal and one or more laser pulse characteristics, the pulse start control signal to define an initiation of lasing by a lasing console and the one or more laser pulse characteristics to define the lasing; determine an integration timing period of a camera sensor of an endoscope based on the pulse start control signal and the one or more laser pulse characteristics; and generate an integration control signal, the integration control signal to cause the camera sensor to accumulate charge during the integration timing period.
- the instructions when executed by the processor further cause the endoscope camera controller to send the integration control signal to shutter gating circuitry of the endoscope, the shutter gating circuitry to control a shutter of the endoscope camera.
- the one or more laser pulse characteristics define a frequency of pulses to be generated by the lasing console.
- the one or more laser pulse characteristics define a pulse width of the pulses.
- the instructions when executed by the processor further cause the endoscope camera controller to determine a maximum duration of integration of the camera sensor based on a frame rate of the camera; and determine the integration timing period as the difference between the maximum duration of integration of the camera and the pulse width.
- the camera is a global shutter style camera or a rolling shutter style camera.
- the lasing console comprises a Thulium fiber laser source or a Holmium laser source.
- the laser console can comprise a processor and memory coupled to the processor, the memory comprising instructions, which when executed by the processor cause the laser console to receive an indication of settings of an endoscope camera; determine laser pulse characteristics and a pulse start time for a laser source of the laser console based on the settings; and cause the laser source to generate pulses of light based on the laser pulse characteristics and the pulse start time.
- the settings comprise an indication of a frame rate and a vertical blanking period of the endoscope camera.
- the instructions when executed by the processor further cause the laser console to determine the pulse start time based on the vertical blanking period.
- the instructions when executed by the processor further cause the laser console to determine the laser pulse characteristics by determining a frequency of the laser source based on a factor of the frame rate.
- the laser source is a Thulium fiber laser source.
- the laser source is a Holmium laser source.
- the endoscope camera is a global shutter style camera or a rolling shutter style camera.
- the console can comprise a processor and memory coupled to the processor, the memory comprising instructions, which when executed by the processor cause the endoscope console to receive an indication of settings of an endoscope camera; receive an indication of a pulse start time and one or more laser pulse characteristics, the pulse start time to define an initiation of lasing by the lasing console and the one or more laser pulse characteristics to define the lasing; receive a plurality of image frames from a camera of an endoscope coupled to the endoscope console; identify at least a portion of a one of the plurality of image frames where lasing by the lasing console overlaps with integration by the camera based on the one or more laser pulse characteristics, the pulse start time, and the settings; identify at least a portion of another one of the plurality of image frames where lasing by the lasing console does not overlap with integration by the camera based on the one or more laser pulse characteristics, the pulse start time, and the settings; and replace at least the portion of the one of the plurality of image frames with at least the portion of the other one of the plurality of image frames.
- the settings comprise an indication of a frame rate and a vertical blanking period of the endoscope camera.
- the one or more laser pulse characteristics define a frequency of pulses to be generated by the lasing console.
- the one or more laser pulse characteristics define a pulse width of the pulses.
- the camera is a global shutter style camera or a rolling shutter style camera.
- the lasing console comprises a Thulium fiber laser source or a Holmium laser source.
- FIG. 1A illustrates a lithotripsy system in accordance with at least one embodiment of the present disclosure.
- FIG. 1B illustrates a portion of the lithotripsy system of FIG. 1A in greater detail.
- FIG. 1C illustrates another portion of the lithotripsy system of FIG. 1A in a particular use case.
- FIG. 1D illustrates the other portion of the lithotripsy system of FIG. 1C in another particular use case.
- FIGS. 2A and 2B illustrate images of an endoscope showing interference from laser light emitted by a fiber optic cable inserted through a working channel of the endoscope.
- FIGS. 3A and 3B illustrate further images of an endoscope showing interference from laser light emitted by a fiber optic cable inserted through a working channel of the endoscope.
- FIG. 4 illustrates an endoscope system in accordance with at least one embodiment of the present disclosure.
- FIG. 5 illustrates a method for synchronizing an integration phase of an endoscope camera with lasing by a laser source in accordance with at least one embodiment of the present disclosure.
- FIG. 6 illustrates a method for synchronizing an illumination source of an endoscope camera with lasing by a laser source in accordance with at least one embodiment of the present disclosure.
- FIG. 7 illustrates another endoscope system in accordance with at least one embodiment of the present disclosure.
- FIG. 8 illustrates a method for synchronizing lasing by a laser source with an integration phase of an endoscope camera in accordance with at least one embodiment of the present disclosure.
- FIG. 9 illustrates yet another endoscope system in accordance with at least one embodiment of the present disclosure.
- FIG. 10 illustrates a method for generating updated image frames to remove interference from an overlapping of an integration phase of an endoscope camera with lasing of a laser source in accordance with at least one embodiment of the present disclosure.
- FIG. 11 illustrates a computer-readable storage medium.
- FIG. 12 illustrates a diagrammatic representation of a machine.
- the present disclosure provides systems and techniques to mitigate interference of endoscopic camera operation. It will be appreciated that such interference can occur at various points during the timeline of treatment and can disrupt some video frames, while leaving other video frames uninfluenced. In cases of sporadic interference, only temporary effects may be present leaving the viewed scene understandable to the physician. In such a case, this can cause a nuisance to the physician but may still allow for safe operation and not require a pause in the procedure. In other cases, such as where the interference occurs at a higher frequency or where the interference is synchronized with the camera frame rate, the effects on the image may be constant, thereby disrupting all or a portion of the image of the scene. In such a case, the ability to continue safe operation may be unavailable to the physician necessitating a pause or cessation of the procedure.
- medical lasers are used in a variety of endoscopic procedures where laser light is directed to a target through an optical fiber.
- One such procedure to address renal calculi (e.g., kidney stones) is ureteral endoscopy, or lithotripsy.
- in lithotripsy, an endoscopic probe, with a camera or other sensor, is inserted into the patient’s urinary tract to locate the calculi for removal.
- An optical fiber is inserted through a working channel of the probe and laser energy can be directed towards the calculi via the optical fiber to disintegrate the calculi as they are found via the camera.
- FIG. 1A to FIG. 1D show an exemplary lithotripsy system 100 configured to mitigate interference in image acquisition.
- Lithotripsy system 100 can comprise an endoscope console 102 and a lasing console 104, coupled to an endoscope 106 and an optical fiber 108, respectively.
- the endoscope 106 can be any of a variety of “scopes” such as, for example, a ureteroscope, a colonoscope, a bronchoscope, or the like.
- the endoscope 106 can include one or more working channels in which the optical fiber 108 can be inserted.
- endoscope 106 can include a camera 110 and a lens 112 (see FIG. 1C and FIG. 1D) with which an image of a target 114 can be acquired.
- the target 114 may be a tissue, a stone, a tumor, a cyst, and the like, within a subject, which is to be treated, ablated, or destroyed.
- the subject may be a human being or an animal.
- a variety of imaging endoscopes exist.
- some endoscopes include a camera on the distal end (e.g., as shown in FIG. 1).
- other endoscopes include an optical channel (e.g., an optical waveguide, a fiber bundle, or the like) coupled to an external camera or to a camera on the proximal portion of the endoscope. Examples are not limited in this context.
- endoscope 106 can include an illumination source 128 (e.g., an LED, an optical channel coupled to a light source at its proximal end, or the like).
- the illumination source 128 can be activated to illuminate the environment in which the endoscope distal end is disposed. It will be appreciated that such an environment often lacks ambient light, so illumination source 128 is provided to illuminate the environment.
- Lasing console 104 can comprise an optical system 116 arranged to generate incident light 118 which can be directed to target 114 via optical fiber 108.
- the optical fiber 108 comprises a proximal end 120 and a distal end 122.
- the proximal end 120 is the end of the optical fiber 108 through which light beams enter while the distal end 122 is the end of the optical fiber 108 through which light beams are emitted and via which light beams can be directed onto the target 114.
- this figure depicts incident light 118 entering the optical fiber 108 at the proximal end 120, propagating through the length of the optical fiber 108, exiting the optical fiber 108 at the distal end 122, and being incident on the target 114 from the distal end 122 of the optical fiber 108.
- optical system 116 can comprise one or more laser light sources and various optics arranged to generate incident light 118 and couple incident light 118 to optical fiber 108.
- the laser light sources may include, but are not limited to, solid-state lasers, gas lasers, diode lasers, and fiber lasers.
- the light beams may include one or more of an aiming beam, a treatment beam, and any other beam transmitted through the optical fiber 108.
- an aiming beam may include a light beam of low intensity that is transmitted through the optical fiber 108 to illuminate or highlight the target 114 while a treatment beam may include a light beam of high intensity that is transmitted through the optical fiber 108 to treat the target 114.
- the various optics within optical system 116 can include, but are not limited to, one or more polarizers, beam splitters, beam combiners, light detectors, wavelength division multiplexers, collimators, circulators, and/or lenses.
- the laser light sources of optical system 116 can comprise a Thulium fiber laser, a Holmium laser, or other types of laser light sources.
- while endoscope console 102 and lasing console 104 are depicted as separate consoles in FIG. 1A, it is to be appreciated that an embodiment could be implemented where endoscope console 102 and lasing console 104 are combined into a single console, for example, sharing many computing components (e.g., processor, memory, display, controls, etc.). However, for purposes of clarity, the disclosure describes examples where endoscope console 102 and lasing console 104 are separate.
- in FIG. 1C and FIG. 1D, camera 110 and lens 112 of endoscope 106 are depicted. As noted above, lens 112 is configured to provide a focal length 124a for camera 110 of a specific distance.
- the vapor bubble 126 acts as another optical element and changes the focal length 124a to a different focal length 124b due to the different index of refraction of the vapor bubble 126, in addition to causing distortion due to irregularity of the vapor bubble.
- laser energy interacting with the tissue (e.g., prostate tissue, stone, etc.) can cause additional optical effects that impact the quality of the acquired images.
- a short-lived plasma volume can be created when a laser beam is incident on a stone, which causes an extremely bright flash. While short lived, this bright flash will interfere with the image acquisition of the camera 110.
- effects such as brightness saturation and/or charge-coupled device (CCD) vertical streaking can be captured by camera 110 thereby degrading the quality of the acquired image frame. Often, such effects will degrade the quality of the image frame down to a level that is unusable to a physician.
- Lasers often used in endoscopic procedures typically operate at frequencies comparable to the frame rate of the endoscope camera. As such, light from the laser can create significant interference resulting in degradation of the video quality. Further, it will be appreciated that for typical laser-based treatments in endoscopic applications (e.g., for urology, or the like) the laser wavelength is selected so that it has a high absorption coefficient in water. This is done to provide a margin of safety so that laser energy fired in a direction that misses the target will not travel too far before it is absorbed by the liquid in the treatment environment, and thus will not damage other tissue or organs. High water absorption also serves as a means for acting on the target tissue, which usually contains a high amount of water.
- a side effect of the high absorption coefficient in water is that a vapor bubble is created in the liquid medium when the laser energy is delivered.
- the vapor inside the bubble has a different refraction index than liquid water. Therefore, the vapor bubble effectively creates an optical element in the water, which lasts for the duration of the bubble.
- Endoscopic camera optics are designed to be focused at a typical operating distance, while operating in water. Adding an additional “lens” between the camera and the target (e.g., due to the vapor bubble) causes defocusing and/or distortion of the image, resulting in blurring of the captured image.
- FIG. 2A illustrates an endoscopic image 200a, which could be captured by camera 110 of endoscope 106 when a vapor bubble 126 is not extant.
- the endoscopic image 200a is relatively clear and/or free from distortions or blurring.
- FIG. 2B illustrates an endoscopic image 200b, which could be captured by camera 110 of endoscope 106 when a vapor bubble 126 is extant.
- endoscopic image 200b is blurry due to the change in effective focusing distance (e.g., focusing distance 124a to 124b) and distortion due to the vapor bubble 126.
- images 200a and 200b correspond to a frame and the subsequent frame that immediately follows it.
- the distortion of the image due to the vapor bubble 126 is instantaneous.
- FIG. 3A illustrates another example endoscopic image 300a, which could be captured by camera 110 of endoscope 106 when a vapor bubble 126 is not extant.
- endoscopic image 300a is relatively clear and/or free from distortions or blurring.
- FIG. 3B illustrates an endoscopic image 300b, which could be captured by camera 110 of endoscope 106 when a vapor bubble 126 is extant.
- endoscopic image 300b is blurry due to the change in effective focusing distance (e.g., focusing distance 124a to 124b) and distortion due to the vapor bubble 126.
- the present disclosure describes a number of example implementations to mitigate this type of interference, or blurring of the endoscope images, resulting from the change in focal length of the camera 110 when the vapor bubble 126 is extant and/or from flashes of light resulting from plasma volumes.
- FIG. 4 illustrates an endoscope system 400, in accordance with non-limiting examples of the present disclosure.
- endoscope system 400 is a system for viewing an image with an endoscope that is configured to reduce interference from lasing.
- Endoscope system 400 includes endoscope console 102 arranged to be coupled to lasing console 104 and endoscope 106.
- endoscope console 102 is arranged to receive information about laser energy (e.g., laser beams, pulses of laser light, or the like) generated or to be generated by lasing console 104 and to synchronize capturing images from camera 110 in endoscope 106 using the received information about the laser energy to reduce interference with the captured images.
- Endoscope console 102 can be any of a variety of computing devices. With some embodiments, endoscope console 102 can be a workstation, server, laptop, or tablet communicatively coupled to endoscope 106 and lasing console 104.
- Endoscope console 102 can include processor 402, memory 404, input and/or output (I/O) devices 406, and network interface 408.
- the processor 402 may include circuitry or processor logic, such as, for example, any of a variety of commercial processors.
- processor 402 may include multiple processors, a multi-threaded processor, a multi-core processor (whether the multiple cores coexist on the same or separate dies), and/or a multi-processor architecture of some other variety by which multiple physically separate processors are in some way linked.
- the processor 402 may include graphics processing portions and may include dedicated memory, multiple-threaded processing and/or some other parallel processing capability.
- the processor 402 may be an application specific integrated circuit (ASIC) or a field-programmable gate array (FPGA).
- the memory 404 may include logic, a portion of which includes arrays of integrated circuits, forming non-volatile memory to persistently store data or a combination of non-volatile memory and volatile memory. It is to be appreciated, that the memory 404 may be based on any of a variety of technologies.
- the arrays of integrated circuits included in memory 404 may be arranged to form one or more types of memory, such as, for example, dynamic random-access memory (DRAM), NAND memory, NOR memory, or the like.
- I/O devices 406 can be any of a variety of devices to receive input and/or provide output.
- I/O devices 406 can include a keyboard, a mouse, a joystick, a foot pedal, a display, a touch enabled display, a haptic feedback device, an LED, or the like.
- Network interface 408 can include logic and/or features to support a communication interface.
- network interface 408 may include one or more interfaces that operate according to various communication protocols or standards to communicate over direct or network communication links. Direct communications may occur via use of communication protocols or standards described in one or more industry standards (including progenies and variants).
- network interface 408 may facilitate communication over a bus, such as, for example, peripheral component interconnect express (PCIe), non-volatile memory express (NVMe), universal serial bus (USB), system management bus (SMBus), SAS (e.g., serial attached small computer system interface (SCSI)) interfaces, serial AT attachment (SATA) interfaces, or the like.
- network interface 408 can include logic and/or features to enable communication over a variety of wired or wireless network standards (e.g., 802.11 communication standards).
- network interface 408 may be arranged to support wired communication protocols or standards, such as, Ethernet, or the like.
- network interface 408 may be arranged to support wireless communication protocols or standards, such as, for example, Wi-Fi, Bluetooth, ZigBee, LTE, 5G, or the like.
- Memory 404 can include instructions 410, laser pulse characteristics 412, laser pulse start signal 414, camera integration timing period 416, control signals 418, and image frames 420.
- processor 402 can execute instructions 410 to cause endoscope console 102 to receive laser pulse characteristics 412 and laser pulse start signal 414 from lasing console 104.
- Laser pulse characteristics 412 can comprise an indication of the pulse repetition rate (e.g., laser pulse frequency, or the like) as well as the duration of the pulse (e.g., pulse width, or the like).
- processor 402 can execute instructions 410 to generate camera integration timing period 416 from laser pulse characteristics 412 and laser pulse start signal 414.
- digital cameras, like camera 110, operate by accumulating charge (or integrating charge) responsive to incident light at points (e.g., pixel cells, frame lines, etc.) in a frame and then “reading out” the charge to a memory location.
- the overall “image acquisition” process for a digital camera includes 4 main parts: (1) clear or reset where each pixel cell is cleared of any accumulated charge, (2) integrate (e.g., accumulate charge) in the pixel cells responsive to incident light, (3) transfer the charge out of the cells, and (4) read out pixel data to a memory storage location. Accordingly, interference from lasing (e.g., either the laser pulse itself or plasma volumes) will only affect the captured image when the interference and integration of charge overlap.
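- Since interference only matters when it overlaps the integration phase, the check reduces to an interval-overlap test. The sketch below is illustrative only; the names and timings are assumptions, not taken from the patent.

```python
def intervals_overlap(a: tuple[float, float], b: tuple[float, float]) -> bool:
    """True when two (start, end) intervals, in seconds, overlap."""
    return a[0] < b[1] and b[0] < a[1]

integration = (0.0, 0.0323)     # phase (2): accumulating charge
laser_pulse = (0.0325, 0.0335)  # pulse timed into the readout gap
print(intervals_overlap(integration, laser_pulse))  # False -> no interference
```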
- Camera 110 can include a camera sensor 422, a shutter 424, and shutter control circuitry and mechanism 426.
- the shutter control circuitry and mechanism 426 can control the opening and closing of shutter 424 to allow or block light from being incident on the camera sensor 422.
- shutter control circuitry and mechanism 426 can cause shutter 424 to open to allow incident light to impinge on camera sensor 422 such that charge can accumulate in the lines (e.g., capacitors, cells, or the like) of the camera sensor 422.
- the present disclosure provides to generate camera integration timing period 416 comprising an indication of when to integrate or accumulate charge in the pixel cells (or pixel lines) so that the integration will take place outside times when interference is present.
- Processor 402 can execute instructions 410 to generate control signals 418 from camera integration timing period 416 and communicate control signals 418 to endoscope 106 (and particularly camera 110) such that the image frames 420 can be captured by camera 110 and received by endoscope console 102 where interference from laser energy emitted by lasing console 104 is mitigated.
- camera 110 can be implemented with a variety of different camera shutter sensors, such as, for example, a global shutter sensor or a rolling shutter sensor. It will further be appreciated that the difference between a global shutter sensor and a rolling shutter sensor is that, for a global shutter sensor, integration for all cells is simultaneous while memory readout is staggered, whereas both integration and memory readout are staggered for a rolling shutter sensor. Accordingly, if interference overlaps the integration period for a global shutter sensor, the interference will affect all sensor lines, while it will affect only some of the sensor lines for a rolling shutter sensor.
- Lasers can be operated at repetition rates similar, or identical to, camera frame rates. As such, many synchronization effects that cause impacts on a time scale beyond interference with a particular frame can result. For example, for a global shutter camera operating at 30 Hertz (Hz) and a laser operating at 30 Hz, the frequencies could be synchronized by chance so that lasing always occurs during the camera readout phase and no interference is registered in the image frames. In another example, the frequencies could be synchronized so that lasing always occurs during the integrating phase, and as such, all image frames will have interference. In yet another example, for a global shutter camera operating at 30 Hz and a laser operating at 60 Hz all image frames will have interference from at least 1 laser pulse.
- in other examples, such as where the laser operates at a lower repetition rate (e.g., 5 Hz against a 30 Hz camera), every 6th image frame may have interference, which may manifest as a “flickering” of the video signal.
- for a rolling shutter camera synchronized with the laser, some of the image frame lines will always have interference, creating a pattern of repeating interference on the resulting video.
- where the laser and camera frequencies are slightly mismatched, the interference will slowly “drift” across the image frames and manifest as drifting interference on the video.
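- The flicker and drift patterns above can be reproduced with simple arithmetic: map each laser pulse to the camera frame it lands in. The sketch below uses assumed example rates.

```python
def hit_frames(camera_hz: float, laser_hz: float, n_pulses: int) -> list[int]:
    """Index of the camera frame each laser pulse falls into."""
    frame_period = 1.0 / camera_hz
    return [int((k / laser_hz) // frame_period) for k in range(n_pulses)]

print(hit_frames(30.0, 5.0, 5))   # [0, 6, 12, 18, 24] -> every 6th frame flickers
print(hit_frames(30.0, 29.0, 6))  # [0, 1, 2, 3, 4, 5] -> pulse timing drifts within frames
```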
- the present disclosure provides the endoscope system 400 configured to synchronize image capture and particularly integration or accumulation of charge in the image sensor of camera 110 with lasing of lasing console 104 based on a variety of different techniques, which can be implemented by endoscope system 400 based on the type of camera sensor (e.g., global shutter, rolling shutter), the frequency of the camera 110, and/or the frequency of the lasing console 104.
- FIG. 5 illustrates a method 500 to synchronize the integration phase of a camera with a lasing console.
- endoscope console 102 can implement method 500 to synchronize the integration phase of camera 110 with lasing by lasing console 104.
- processor 402 can execute instructions 410 to cause endoscope console 102 to implement method 500.
- Method 500 can begin at block 502.
- a pulse start control signal and one or more laser pulse characteristics can be received from a lasing console.
- the pulse start control signal can define an initiation of lasing by the lasing console while the one or more laser pulse characteristics can define the lasing.
- processor 402 can execute instructions 410 to receive laser pulse start signal 414 and laser pulse characteristics 412.
- laser pulse start signal 414 can correspond to a signal that defines the initiation of lasing by lasing console 104 while laser pulse characteristics 412 can define the lasing itself (e.g., pulse width, frequency, etc.).
- an integration timing period for a camera sensor can be determined based on the pulse start control signal and the one or more laser pulse characteristics.
- processor 402 can execute instructions 410 to determine camera integration timing period 416 based on laser pulse start signal 414 and laser pulse characteristics 412, where camera integration timing period 416 defines the period over which integration is to take place.
- processor 402 can execute instructions 410 to determine a period in which to open shutter 424 such that camera sensor 422 can accumulate charge when the lasing console 104 is not lasing. In many embodiments, this will result in a shorter integration phase than in conventional camera operation. This will be described in greater detail below.
- an integration control signal can be generated where the integration control signal is configured to cause a camera sensor to accumulate charge during a period specified by the control signal.
- processor 402 can execute instructions 410 to generate control signals 418 where control signals 418 can cause shutter 424 to open during the integration timing period (e.g., camera integration timing period 416, or the like) such that camera sensor 422 can accumulate charge during the integration timing period.
- the control signal can be sent to the camera.
- processor 402 can execute instructions 410 to send the control signals 418 to the camera 110 or rather, to shutter control circuitry and mechanism 426 which can cause shutter 424 to open and/or close based on camera integration timing period 416.
- the phrases “open shutter 424” and “close shutter 424” could apply either to opening and closing a mechanical shutter or to starting and stopping an integration phase of an electronic shutter.
- shutter 424 may not be a physical mechanical shutter but instead may be an electronic shutter arranged to accumulate charge in sensor lines and read the accumulated charge from the sensor lines.
- the phrase “open shutter 424” could in such an example mean initiating an accumulation of charge in the sensor lines while the phrase “close shutter 424” could mean stopping the accumulation of charge.
- the method 500 can further include blocks (not shown) where information elements comprising indications of frame data can be received from the camera, processed, and displayed on a display.
- processor 402 can execute instructions 410 to receive image frames 420 from camera 110 (e.g., via a memory readout process, or the like) and can process the image frames 420 to generate images to display on a display coupled to the endoscope system 400.
- the method 500 can be implemented in real time, or rather, during a procedure, such that capturing images by camera 110 of endoscope 106 can be synchronized with lasing by lasing console 104 to reduce and/or mitigate interference manifest in the captured images.
- method 500 can be provided to generate the integration timing period (e.g., camera integration timing period 416) in a variety of different ways.
- processor 402 can execute instructions 410 to reduce a duration of the maximum integration period by a duration of the interference to determine the integration timing period.
- Processor 402 can execute instructions 410 to determine a duration of interference based on the laser pulse characteristics 412 and laser pulse start signal 414. For example, for a 30 Hz laser pulse with a pulse width of 1 millisecond (ms) processor 402 can execute instructions 410 to determine that the duration of the interference is 1 ms, which repeats every 33 ms.
- processor 402 can determine that the maximum duration of the integration timing period can be 33 ms. Given the duration of the interference (e.g., based on laser pulse characteristics 412 and laser pulse start signal 414) and the maximum duration of the integration timing period (e.g., based on settings for camera 110), processor 402 can execute instructions 410 to determine camera integration timing period 416 by subtracting the duration of the interference from the maximum duration of the integration timing period (e.g., 33 ms minus 1 ms in the example above). As integration for all memory lines is done in parallel in a global shutter camera, accumulation of charge for all lines or cells will be shortened. Further, processor 402 can execute instructions 410 to generate control signals 418 such that the shutter 424 of camera 110 will open for the determined camera integration timing period 416 outside the time the laser beam is activated (e.g., based on laser pulse characteristics 412).
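- A minimal sketch of the global shutter case described above, assuming a 30 Hz laser with a 1 ms pulse at the start of each period (the constant and function names are illustrative): open the shutter just after a pulse ends and close it as the next pulse begins.

```python
FRAME_PERIOD_S = 1.0 / 30.0  # ~33.3 ms between laser pulses at 30 Hz
PULSE_WIDTH_S = 0.001        # 1 ms of lasing (interference) per period

def shutter_window(pulse_start_s: float) -> tuple[float, float]:
    """(open, close) times placing the integration phase outside the laser pulse."""
    open_t = pulse_start_s + PULSE_WIDTH_S    # open right after lasing ends
    close_t = pulse_start_s + FRAME_PERIOD_S  # close as the next pulse begins
    return open_t, close_t

print(shutter_window(0.0))  # (0.001, 0.0333...) -> ~32.3 ms of integration
```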
- processor 402 can execute instructions 410 to determine camera integration timing period 416 in a similar manner to that described for a global shutter camera above.
- for rolling shutter cameras, where integration is done in a rolling (e.g., often overlapping) fashion, accumulation of charge for certain lines will be shortened, or paused, while integration for other cells will be unaffected.
- further, as memory readout is done in parallel to integration, shortening the integration phase for some lines may result in a delay in memory readout for those lines. As such, larger memory buffers and/or memory processing circuitry on the receiving hardware side may be needed to address the delay and reduce the chance of non-contiguous frames being generated.
- FIG. 6 illustrates a method 600 to synchronize the illumination of an environment (and thus the integration phase of a camera) with a lasing console.
- endoscope console 102 can implement method 600 to synchronize the activation of illumination source 128 with lasing by lasing console 104.
- processor 402 can execute instructions 410 to cause endoscope console 102 to implement method 600.
- Method 600 can begin at block 602.
- a pulse start control signal and one or more laser pulse characteristics can be received from a lasing console.
- the pulse start control signal can define an initiation of lasing by the lasing console while the one or more laser pulse characteristics can define the lasing.
- processor 402 can execute instructions 410 to receive laser pulse start signal 414 and laser pulse characteristics 412.
- laser pulse start signal 414 can correspond to a signal that defines the initiation of lasing by lasing console 104 while laser pulse characteristics 412 can define the lasing itself (e.g., pulse width, frequency, etc.).
- an integration timing period for a camera sensor can be determined based on the pulse start control signal and the one or more laser pulse characteristics.
- processor 402 can execute instructions 410 to determine camera integration timing period 416 based on laser pulse start signal 414 and laser pulse characteristics 412, where camera integration timing period 416 defines the period over which integration is to take place.
- processor 402 can execute instructions 410 to determine a period in which to open shutter 424 such that camera sensor 422 can accumulate charge when the lasing console 104 is not lasing. In many embodiments, this will result in a shorter integration phase than in conventional camera operation. This will be described in greater detail below.
- an illumination control signal can be generated where the illumination control signal is configured to cause an illumination source to activate during a period specified by the control signal.
- processor 402 can execute instructions 410 to generate control signals 418 where control signals 418 can cause illumination source 128 to activate during the integration timing period (e.g., camera integration timing period 416, or the like) such that even when camera sensor 422 accumulates charge, accent lighting (e.g., non-ambient light) will only be provided during the integration timing period.
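- A hedged sketch of this illumination-gating variant: rather than gating the shutter, the console toggles illumination source 128 so that non-ambient light is present only while the sensor integrates. Here send_control is a hypothetical stand-in for the console's real control-signal I/O, not an API from the patent.

```python
def send_control(target: str, command: str, at_time_s: float) -> None:
    """Hypothetical stand-in for the console's control-signal output."""
    print(f"t={at_time_s:.4f}s -> {target}: {command}")

def gate_illumination(window_open_s: float, window_close_s: float) -> None:
    """Light the scene only during the integration timing period."""
    send_control("illumination_source_128", "ON", window_open_s)
    send_control("illumination_source_128", "OFF", window_close_s)

gate_illumination(0.001, 0.0323)
```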
- the control signal can be sent to the illumination source.
- processor 402 can execute instructions 410 to send the control signals 418 to the illumination source 128.
- the method 600 can further include blocks (not shown) where information elements comprising indications of frame data can be received from the camera, processed, and displayed on a display.
- processor 402 can execute instructions 410 to receive image frames 420 from camera 110 (e.g., via a memory readout process, or the like) and can process the image frames 420 to generate images to display on a display coupled to the endoscope system 400.
- the method 600 can be implemented in real time, or rather, during a procedure, such that capturing images by camera 110 of endoscope 106 can be synchronized with lasing by lasing console 104 to reduce and/or mitigate interference manifest in the captured images.
- FIG. 7 illustrates an endoscope system 700, in accordance with non-limiting examples of the present disclosure.
- endoscope system 700 is a system for viewing an image with an endoscope that is configured to reduce interference from lasing.
- Endoscope system 700 includes many of the same components described above with respect to endoscope system 400 and FIG. 4. The difference is that endoscope system 400 is configured to change the shutter settings of camera 110 to reduce interference while endoscope system 700 is configured to change the settings of the lasing console 104 to reduce interference.
- memory 404 can include instructions 708, camera settings 702, laser pulse characteristics 704, laser pulse start signal 706, and image frames 420.
- processor 402 can execute instructions 708 to cause endoscope console 102 to receive camera settings 702 from camera 110.
- Camera settings 702 can comprise an indication of the frequency of camera 110 operation, the shutter type, and/or memory readout settings, or the like.
- processor 402 can execute instructions 708 to generate laser pulse characteristics 704 and laser pulse start signal 706 from camera settings 702.
- processor 402 can execute instructions 708 to determine periods where integration is not taking place based on camera settings 702 (e.g., periods of memory readout for global shutter type cameras, vertical blanking periods, or the like) and determine laser pulse characteristics 704 (e.g., pulse repetition rate, pulse duration, etc.) and laser pulse start signal 706 such that lasing by lasing console 104 takes place during these periods (e.g., outside the integration phase).
- Processor 402 can further execute instructions 708 to send information elements comprising indications of laser pulse characteristics 704 and laser pulse start signal 706 to lasing console 104 such that lasing console 104 can generate pulses of light based on laser pulse characteristics 704 and laser pulse start signal 706.
- laser pulse characteristics 704 will need to be synchronized with the frequency or frame rate of the camera 110.
- for a camera 110 operating at 30 Hz, for example, laser pulse characteristics 704 can comprise a pulse rate of 30 Hz, 15 Hz, 10 Hz, 6 Hz, 5 Hz, or 3 Hz, but cannot exceed 30 Hz.
- endoscope system 700 may not be practical for all laser system applications (e.g., those operating at 80 Hz, or the like).
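- Pulse rates that stay synchronized with, and never exceed, the camera frame rate are its integer divisors. The helper below is an illustrative sketch (the list above names a subset of these divisors).

```python
def candidate_pulse_rates(frame_rate_hz: int) -> list[int]:
    """Integer divisors of the frame rate, largest first."""
    return [frame_rate_hz // n for n in range(1, frame_rate_hz + 1)
            if frame_rate_hz % n == 0]

print(candidate_pulse_rates(30))  # [30, 15, 10, 6, 5, 3, 2, 1]
```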
- FIG. 8 illustrates a method 800 to synchronize lasing of a laser source with the integration phase of an endoscope camera.
- endoscope console 102 can implement method 800 to synchronize lasing by lasing console 104 with the integration phase of camera 110.
- processor 402 can execute instructions 708 to cause endoscope console 102 to implement method 800.
- Method 800 can begin at block 802.
- settings of the camera can be received from an endoscope camera.
- processor 402 can execute instructions 708 to receive camera settings 702 from camera 110.
- camera settings 702 can correspond to the shutter type, frame rate, memory readout settings, timing of a vertical blanking interval, or the like.
- laser pulse characteristics and a pulse start time for a laser source can be determined based on the camera settings.
- processor 402 can execute instructions 708 to determine laser pulse characteristics 704 and laser pulse start signal 706 based on camera settings 702.
- laser pulse characteristics 704 can include a pulse repetition rate and a pulse duration.
- the pulse repetition rate will be a factor of the frame rate of the camera 110, but will not exceed the frame rate of the camera.
- the pulse duration will be generated such that given the pulse repetition rate, the laser pulse is generated in a time when integration is not taking place (e.g., during memory readout, during a vertical blanking interval, or the like).
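- As a sketch of this scheduling, assuming the vertical blanking interval begins at a fixed offset within each frame (all parameter names and values below are illustrative assumptions), each pulse can be started inside the blanking interval of successive frames:

```python
def pulse_start_times(frame_rate_hz: float, blank_offset_s: float,
                      blank_len_s: float, pulse_width_s: float, n: int) -> list[float]:
    """Start each pulse inside the blanking interval of successive frames."""
    if pulse_width_s > blank_len_s:
        raise ValueError("pulse does not fit inside the blanking interval")
    frame_period = 1.0 / frame_rate_hz
    return [k * frame_period + blank_offset_s for k in range(n)]

# 30 Hz camera, blanking 30 ms into each ~33.3 ms frame, 2 ms long, 1 ms pulse
print(pulse_start_times(30.0, 0.030, 0.002, 0.001, 3))
```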
- processor 402 can execute instructions 708 to send laser pulse characteristics 704 and laser pulse start signal 706 to lasing console 104 and cause lasing console 104 to generate laser pulses based on laser pulse characteristics 704 and laser pulse start signal 706.
- method 800 can further include blocks (not shown) where information elements comprising indications of frame data can be received from the camera, processed, and displayed on a display.
- processor 402 can execute instructions 708 to receive image frames 420 from camera 110 (e.g., via a memory readout process, or the like) and can process the image frames 420 to generate images to display on a display coupled to the endoscope system 700.
- the method 800 can be implemented in real time, or rather during a procedure, such that lasing by lasing console 104 can be synchronized with capturing images by camera 110 of endoscope 106 to reduce and/or mitigate interference manifest in the captured images.
- FIG. 9 illustrates an endoscope system 900, in accordance with non-limiting examples of the present disclosure.
- endoscope system 900 is a system for viewing an image with an endoscope that is configured to reduce interference from lasing.
- Endoscope system 900 includes many of the same components described above with respect to endoscope system 400 of FIG. 4 and endoscope system 700 of FIG. 7. The difference is that endoscope system 900 is configured to identify frames and/or portions of a frame where interference occurred based on the above detailed camera/laser synchronization details and generate adjusted frames to remove the interference.
- memory 404 can include instructions 902, camera settings 702, laser pulse characteristics 412, laser pulse start signal 414, image frames 420, and adjusted image frames 904.
- processor 402 can execute instructions 902 to cause endoscope console 102 to receive camera settings 702 from camera 110.
- Camera settings 702 can comprise an indication of the frequency of camera 110 operation, the shutter type, and/or memory readout settings, or the like.
- processor 402 can execute instructions 902 to cause endoscope console 102 to receive laser pulse characteristics 412 (e.g., frequency, duration, etc.) and laser pulse start signal 414 from lasing console 104.
- processor 402 can execute instructions 902 to receive image frames 420 from endoscope 106.
- processor 402 can execute instructions 902 to determine frames of image frames 420 and/or portions of a frame in image frames 420 in which interference occurred (e.g., where lasing by lasing console 104 overlapped with an integration phase by camera 110). Processor 402 can execute instructions 902 to replace the identified frames or portions of frames with a prior one of image frames 420 (or a portion of a prior frame as may be the case).
- processor 402 can execute instructions 902 to determine frames from image frames 420 where lasing by lasing console 104 overlapped with integration by camera 110 and replace the identified frames with the most recent prior frame where lasing by lasing console 104 did not overlap with integration by camera 110.
- processor 402 can execute instructions 902 to determine which lines of a frame had integration by camera 110 that overlapped with lasing by lasing console 104 and replace those lines of the frame with prior lines where integration by camera 110 did not overlap with lasing by lasing console 104.
- endoscope system 900 may be limited to situations where the scene being imaged by camera 110 of endoscope 106 is changing slowly (e.g., with little change between the current frame and prior frame). Further, endoscope system 900 may be more practical where the repetition rate of the laser pulse generated by lasing console 104 is slower than the frame rate of camera 110, and as such, the interference will not manifest in every frame, and even if it does manifest in every frame, it will not manifest in the same video lines of each frame.
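- A minimal sketch of the frame-replacement approach described above: frames flagged as captured while lasing overlapped integration are substituted with the most recent clean frame. The flags would come from the timing comparison already described; all names here are illustrative.

```python
def replace_interfered(frames: list, interfered: list[bool]) -> list:
    """Substitute each flagged frame with the most recent clean frame."""
    adjusted, last_clean = [], None
    for frame, hit in zip(frames, interfered):
        if not hit:
            last_clean = frame
        # fall back to the frame itself if no clean frame has been seen yet
        adjusted.append(last_clean if last_clean is not None else frame)
    return adjusted

print(replace_interfered(["f0", "f1", "f2", "f3"], [False, True, False, True]))
# ['f0', 'f0', 'f2', 'f2']
```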
- camera 110 of endoscope 106 can have a frame rate that is higher than a repetition rate of the lasing console 104.
- Processor 402 can execute instructions 902 to identify frames of image frames (e.g., one or more frames of the several image frames 420, or the like) captured while lasing console 104 is emitting a laser beam (e.g., resulting in possible interference in the captured image).
- Processor 402 can execute instructions 902 to generate adjusted image frames 904 by removing the identified image frames from the set of image frames. Said differently, processor 402 can execute instructions 902 to remove the identified image frames from the image frames 420 to form the adjusted image frames 904. Further, processor 402 can execute instructions 902 to recreate and/or render a video stream based on the adjusted image frames 904.
- FIG. 10 illustrates a method 1000 to remove interference from image frames captured by an endoscope camera based on synchronizing lasing of a laser source with the integration phase of the endoscope camera.
- endoscope console 102 can implement method 1000 to generate adjusted or updated frames by replacing a frame (or portions of the frame) where integration by camera 110 of endoscope 106 overlapped with lasing by lasing console 104 with another frame (or portions of the other frame) where lasing by lasing console 104 did not overlap with integration by camera 110 of endoscope 106.
- Method 1000 can begin at block 1002.
- settings of the camera can be received from an endoscope camera.
- processor 402 can execute instructions 902 to receive camera settings 702 from camera 110.
- camera settings 702 can correspond to the shutter type, frame rate, memory readout settings, timing of a vertical blanking interval, or the like.
- a pulse start control signal and one or more laser pulse characteristics can be received from a lasing console.
- the pulse start control signal can define an initiation of lasing by the lasing console while the one or more laser pulse characteristics can define the lasing.
- processor 402 can execute instructions 902 to receive laser pulse start signal 414 and laser pulse characteristics 412.
- laser pulse start signal 414 can correspond to a signal that defines the initiation of lasing by lasing console 104 while laser pulse characteristics 412 can define the lasing itself (e.g., pulse width, frequency, etc.).
- indications of image frames can be received from the camera of the endoscope.
- processor 402 can execute instructions 902 to receive image frames 420 from camera 110 of endoscope 106.
- at block 1008, a frame or portions of a frame of the received image frames where lasing by the lasing console overlapped with integration by the endoscope camera can be identified by the processor based on the laser pulse characteristics, the pulse start time, and the camera settings.
- processor 402 can execute instructions 902 to identify a frame from image frames 420, or a portion (e.g., lines) of the frame of image frames 420 where integration of the camera 110 of endoscope 106 overlapped with lasing by lasing console 104 based on camera settings 702, laser pulse characteristics 412, and laser pulse start signal 414.
- processor 402 can execute instructions 902 to identify a frame from image frames 420, or a portion (e.g., lines) of the frame of image frames 420 where integration of the camera 110 of endoscope 106 did not overlap with lasing by lasing console 104 based on camera settings 702, laser pulse characteristics 412, and laser pulse start signal 414. It is noted that the frame identified at block 1010 will be prior to the frame identified at block 1008.
- the frame (or portions of the frame) identified at block 1008 can be replaced by the frame (or portions of the frame) identified at block 1010.
- processor 402 can execute instructions 902 to generate adjusted image frames 904 by replacing the frame (or portions of the frame) from image frames 420 identified as having overlapping integration and lasing with the prior frame (or portions of the prior frame) identified as not having overlapping integration and lasing.
- FIG. 11 illustrates computer-readable storage medium 1100.
- Computer-readable storage medium 1100 may comprise any non-transitory computer-readable storage medium or machine-readable storage medium, such as an optical, magnetic or semiconductor storage medium. In various embodiments, computer-readable storage medium 1100 may comprise an article of manufacture.
- computer-readable storage medium 1100 may store computer-executable instructions 1102 that circuitry (e.g., processor 402, or the like) can execute.
- computer-executable instructions 1102 can include instructions to implement operations described with respect to method 500, method 600, method 800, and/or method 1000.
- Examples of computer-readable storage medium 1100 or machine-readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth.
- Examples of computer executable instructions 1102 may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like.
- FIG. 12 illustrates a diagrammatic representation of a machine 1200 in the form of a computer system within which a set of instructions may be executed for causing the machine to perform any one or more of the methodologies discussed herein. More specifically, FIG. 12 shows a diagrammatic representation of the machine 1200 in the example form of a computer system, within which instructions 1208 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 1200 to perform any one or more of the methodologies discussed herein may be executed. For example, the instructions 1208 may cause the machine 1200 to execute method 500 of FIG. 5, method 600 of FIG. 6, method 800 of FIG. 8, and/or method 1000 of FIG. 10, or the like.
- the instructions 1208 may cause the machine 1200 to synchronize lasing by a laser source with integration by an endoscope camera, synchronize integration by an endoscope camera with lasing by a laser source, or replace frames in captured images based on the timing of lasing by a laser source and integration by an endoscope camera.
- the instructions 1208 transform the general, non-programmed machine 1200 into a particular machine 1200 programmed to carry out the described and illustrated functions in a specific manner.
- the machine 1200 operates as a standalone device or may be coupled (e.g., networked) to other machines.
- the machine 1200 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine 1200 may comprise, but not be limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a PDA, an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 1208, sequentially or otherwise, that specify actions to be taken by the machine 1200.
- the term “machine” shall also be taken to include a collection of machines 1200 that individually or jointly execute the instructions 1208 to perform any one or more of the methodologies discussed herein.
- the machine 1200 may include processors 1202, memory 1204, and I/O components 1242, which may be configured to communicate with each other such as via a bus 1244.
- the processors 1202 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an ASIC, a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) may include, for example, a processor 1206 and a processor 1210 that may execute the instructions 1208.
- the term “processor” is intended to include multi-core processors that may comprise two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously.
- although FIG. 12 shows multiple processors 1202, the machine 1200 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.
- the memory 1204 may include a main memory 1212, a static memory 1214, and a storage unit 1216, each accessible to the processors 1202 such as via the bus 1244.
- the main memory 1212, the static memory 1214, and the storage unit 1216 store the instructions 1208 embodying any one or more of the methodologies or functions described herein.
- the instructions 1208 may also reside, completely or partially, within the main memory 1212, within the static memory 1214, within machine-readable medium 1218 within the storage unit 1216, within at least one of the processors 1202 (e.g., within the processor’s cache memory), or any suitable combination thereof, during execution thereof by the machine 1200.
- the I/O components 1242 may include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on.
- the specific I/O components 1242 that are included in a particular machine will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 1242 may include many other components that are not shown in FIG. 12.
- the I/O components 1242 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting. In various example embodiments, the I/O components 1242 may include output components 1228 and input components 1230.
- the output components 1228 may include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor, resistance mechanisms), other signal generators, and so forth.
- the input components 1230 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or another pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and/or force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.
- the I/O components 1242 may include biometric components 1232, motion components 1234, environmental components 1236, or position components 1238, among a wide array of other components.
- the biometric components 1232 may include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram-based identification), and the like.
- the motion components 1234 may include acceleration sensor components (e.g., accelerometer), gravitation sensor components, rotation sensor components (e.g., gyroscope), and so forth.
- the environmental components 1236 may include, for example, illumination sensor components (e.g., photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment.
- the position components 1238 may include location sensor components (e.g., a GPS receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.
- the I/O components 1242 may include communication components 1240 operable to couple the machine 1200 to a network 1220 or devices 1222 via a coupling 1224 and a coupling 1226, respectively.
- the communication components 1240 may include a network interface component or another suitable device to interface with the network 1220.
- the communication components 1240 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components (e.g., Bluetooth® Low Energy), Wi-Fi® components, and other communication components to provide communication via other modalities.
- the devices 1222 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a USB).
- the communication components 1240 may detect identifiers or include components operable to detect identifiers.
- the communication components 1240 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar code, multi-dimensional bar codes such as Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, UCC RSS-2D bar code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals).
- a variety of information may be derived via the communication components 1240, such as location via Internet Protocol (IP) geolocation, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.
- the various memories (i.e., memory 1204, main memory 1212, static memory 1214, and/or memory of the processors 1202) and/or storage unit 1216 may store one or more sets of instructions and data structures (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. These instructions (e.g., the instructions 1208), when executed by processors 1202, cause various operations to implement the disclosed embodiments.
- the terms “machine-storage medium,” “device-storage medium,” and “computer-storage medium” mean the same thing and may be used interchangeably in this disclosure.
- the terms refer to a single or multiple storage devices and/or media (e.g., a centralized or distributed database, and/or associated caches and servers) that store executable instructions and/or data.
- the terms shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media, including memory internal or external to processors.
- specific examples of machine-storage media include non-volatile memory, including by way of example semiconductor memory devices, e.g., erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), FPGA, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
- one or more portions of the network 1220 may be an ad hoc network, an intranet, an extranet, a VPN, a LAN, a WLAN, a WAN, a WWAN, a MAN, the Internet, a portion of the Internet, a portion of the PSTN, a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks.
- the network 1220 or a portion of the network 1220 may include a wireless or cellular network
- the coupling 1224 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling.
- the coupling 1224 may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1xRTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, Third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standard, others defined by various standard-setting organizations, other long-range protocols, or other data transfer technology.
- the instructions 1208 may be transmitted or received over the network 1220 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 1240) and utilizing any one of several well-known transfer protocols (e.g., hypertext transfer protocol (HTTP)). Similarly, the instructions 1208 may be transmitted or received using a transmission medium via the coupling 1226 (e.g., a peer-to-peer coupling) to the devices 1222.
- the terms “transmission medium” and “signal medium” mean the same thing and may be used interchangeably in this disclosure.
- the terms “transmission medium” and “signal medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 1208 for execution by the machine 1200, and include digital or analog communications signals or other intangible media to facilitate communication of such software.
- the terms “transmission medium” and “signal medium” shall further be taken to include any form of modulated data signal, carrier wave, and so forth.
- the term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- references to "one embodiment” or “an embodiment” do not necessarily refer to the same embodiment, although they may.
- the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” Words using the singular or plural number also include the plural or singular number respectively, unless expressly limited to one or multiple ones.
- the words “herein,” “above,” “below” and words of similar import when used in this application, refer to this application as a whole and not to any portions of this application.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Radiology & Medical Imaging (AREA)
- Heart & Thoracic Surgery (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Signal Processing (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Multimedia (AREA)
- Endoscopes (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
The present disclosure relates to an endoscopic laser lithotripsy device, and to methods for such a device, in which image acquisition, lasing, illumination, and/or image processing are configured to mitigate distortions in the captured image resulting from the laser.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363510751P | 2023-06-28 | 2023-06-28 | |
| US60/510,751 | 2023-06-28 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2025006759A2 true WO2025006759A2 (fr) | 2025-01-02 |
| WO2025006759A3 WO2025006759A3 (fr) | 2025-02-20 |
Family
ID=93940201
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/035835 Pending WO2025006759A2 (fr) | 2023-06-28 | 2024-06-27 | Atténuation d'interférence de caméra endoscopique |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025006759A2 (fr) |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20190191975A1 (en) * | 2017-12-27 | 2019-06-27 | Ethicon Llc | Fluorescence imaging in a light deficient environment |
| US11712155B2 (en) * | 2019-06-20 | 2023-08-01 | Cilag GmbH International | Fluorescence videostroboscopy of vocal cords |
| US11805323B2 (en) * | 2021-04-01 | 2023-10-31 | Brillnics Singapore Pte. Ltd. | Solid-state imaging device, method for driving solid-state imaging device, and electronic apparatus |
-
2024
- 2024-06-27 WO PCT/US2024/035835 patent/WO2025006759A2/fr active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025006759A3 (fr) | 2025-02-20 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9750393B2 (en) | Location of fragments during lithotripsy | |
| US11019327B2 (en) | Endoscope employing structured light providing physiological feature size measurement | |
| US20250366718A1 (en) | System and method for eye tracking | |
| WO2016072059A1 (fr) | Système d'endoscope, dispositif de traitement d'image, procédé de traitement d'image et programme | |
| KR20150033572A (ko) | 모바일 기기를 위한 추적 광학 시스템 | |
| US9345391B2 (en) | Control device, endoscope apparatus, aperture control method, and information storage medium | |
| CN104364606A (zh) | 摄像装置和摄像方法 | |
| KR20220121248A (ko) | 대상의 신체 부위에 수행된 치료 동작에 대한 피드백 제공 | |
| EP3768144A1 (fr) | Endoscope utilisant une lumière structurée fournissant une mesure de taille de caractéristiques physiologiques | |
| CN109196520A (zh) | 生物特征识别装置、方法和电子设备 | |
| EP3582681B1 (fr) | Système et procédé destinés à être utilisés dans la détection à distance | |
| CN205539525U (zh) | 一种摄像头自动查找系统 | |
| EP3554338B1 (fr) | Détermination de contour de surface oculaire au moyen d'une kératométrie multifocale | |
| WO2025006759A2 (fr) | Atténuation d'interférence de caméra endoscopique | |
| US20070263908A1 (en) | Iris Authentication Device | |
| KR102462975B1 (ko) | 인공지능 기반의 자궁경부암 검진 서비스 시스템 | |
| JP2012107944A (ja) | 対象識別装置及び対象識別方法 | |
| JP2019033971A (ja) | 内視鏡装置 | |
| CN104605819A (zh) | 用于近红外静脉血管显像点对点投影的装置及方法 | |
| US20250000333A1 (en) | Endoscope sensor interference mitigation | |
| CN207037684U (zh) | 一种3d虹膜采集装置 | |
| JP6994793B1 (ja) | 脱毛装置及び照射位置補正方法 | |
| CN110197161B (zh) | 静脉识别方法及相关产品 | |
| WO2025072588A1 (fr) | Systèmes pour prévenir l'émission laser à l'intérieur d'un endoscope | |
| KR20250154472A (ko) | 수술용 레이저를 위한 빔 회절 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24746093; Country of ref document: EP; Kind code of ref document: A2 |