WO2017122541A1 - Image processing device, image processing method, program, and surgical system - Google Patents
- Publication number
- WO2017122541A1 (PCT/JP2016/089036; JP2016089036W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- monitor
- image
- image processing
- display
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000095—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/00048—Constructional features of the display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/80—Geometric correction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/92—Dynamic range modification of images or parts thereof based on global image properties
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00193—Optical arrangements adapted for stereoscopic vision
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
- A61B2090/306—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using optical fibres
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/371—Surgical systems with images on a monitor during operation with simultaneous use of two cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/372—Details of monitor hardware
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/373—Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20182—Noise reduction or smoothing in the temporal domain; Spatio-temporal filtering
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20192—Edge enhancement; Edge preservation
Definitions
- The present technology relates to an image processing device, an image processing method, a program, and a surgical system, and in particular to an image processing device, an image processing method, a program, and a surgical system capable of displaying an appropriate surgical site image.
- Patent Document 1 describes a technique in which reference images are generated by reducing a medical image, the number of reference images that can be displayed is calculated according to the monitor size, and only that number of reference images is arranged on the monitor's display screen, so that the reference images are displayed at a uniform size regardless of the monitor size.
- The visibility of an image displayed on a monitor, that is, how its image quality is perceived, varies depending on the viewing distance and the monitor size.
- In a medical setting such as an operating room, the monitor is often installed within a predetermined range of distances from the viewer. The monitor size therefore greatly affects the visibility of the image displayed on it.
- It is desirable that an appropriate surgical site image with good visibility for the operator and others be displayed on the monitor, the display device that shows the surgical site image.
- the present technology has been made in view of such a situation, and makes it possible to display an appropriate surgical part image.
- The image processing device or the program according to an embodiment of the present technology includes a control unit that performs control so that correction processing is applied to a surgical site image, based on information related to display on the display device that displays the surgical site image.
- The image processing method according to an embodiment of the present technology includes controlling correction processing to be performed on a surgical site image, based on information related to display on the display device that displays the surgical site image.
- The surgical system according to an embodiment of the present technology includes an endoscope that captures a surgical site image, a control unit that performs control so that correction processing is applied to the image captured by the endoscope, based on information related to display on the display device that displays the surgical site image, and the display device that displays the corrected image after the correction processing.
- In the image processing device, the image processing method, the program, and the surgical system of the present technology, correction processing on the surgical site image is controlled based on information related to display on the display device that displays the surgical site image.
- the image processing apparatus may be an independent apparatus or an internal block constituting one apparatus.
- the program can be provided by being transmitted through a transmission medium or by being recorded on a recording medium.
- FIG. 4 is a diagram illustrating an example of storage of a plurality of image processing parameters in a parameter storage unit 52.
- FIG. 5 is a diagram illustrating the relationship between the monitor size of a monitor and the contrast sensitivity with respect to the contrast of an image displayed on the monitor.
- FIG. 6 is a diagram illustrating the relationship between the monitor size of a monitor and the noise sensitivity with respect to the noise of an image displayed on the monitor.
- FIG. 7 is a diagram illustrating an example of NR processing.
- Block diagrams illustrating first, third, and fourth configuration examples of a monitor size estimation unit 61.
- A diagram illustrating the estimation of the depth of each pixel of a reference image.
- Block diagrams illustrating fourth, fifth, and sixth configuration examples of the CCU 31.
- A block diagram illustrating a configuration example of an image processing method determination unit 121.
- FIG. 18 is a block diagram illustrating a configuration example of an embodiment of a computer to which the present technology is applied.
- FIG. 1 is a diagram illustrating a configuration example of an embodiment of an endoscopic surgery system to which the present technology is applied.
- The endoscopic surgery system of FIG. 1 is installed in an operating room, and abdominal surgery and the like are performed by endoscopic surgery using this system instead of by conventional laparotomy.
- a patient bed 11 is arranged in the operating room, and a patient 12 lies on the patient bed 11.
- Trocars 13, 14, 15, and 16, serving as opening devices, are attached to several small holes (four in FIG. 1) made in the abdominal wall of the patient 12.
- The trocars 13 to 16 have holes (not shown) through which surgical tools used for surgery (for example, an endoscope, an energy treatment tool such as an electric knife, and forceps) are inserted into the body of the patient 12.
- In FIG. 1, an endoscope 21 is inserted through the trocar 13, an insufflation needle 22 through the trocar 14, an energy treatment tool 23 through the trocar 15, and forceps 24 through the trocar 16.
- Each of the endoscope 21, the energy treatment tool 23, and the forceps 24 is held by, for example, the operator, an assistant, a scopist, or a robot.
- The endoscopic surgery system includes the endoscope 21, the insufflation needle 22, the energy treatment tool 23, the forceps 24, a cart 30, a CCU (Camera Control Unit) 31, a light source device 32, a treatment tool device 33, an insufflation apparatus 34, a recorder 35, a printer 36, a monitor 37, and a foot switch 38.
- A surgical site image captured by the endoscope 21, showing an affected part (such as a tumor) 12A as the surgical site to be operated on, is displayed on the monitor 37.
- the surgeon performs a treatment such as excision of the affected area 12A with the energy treatment tool 23 or the like while viewing the operation part image displayed on the monitor 37 in real time.
- The endoscope 21 has a camera (imaging device), that is, a camera head including an image sensor and an observation optical system (neither shown), and captures images.
- The endoscope 21 is a laparoscope, and illuminates the affected part 12A and its periphery by emitting light supplied from the light source device 32 via the light guide cable 32A.
- The endoscope 21 receives, at the image sensor of the camera head via the observation optical system, the reflected light of the light it emits, thereby capturing a surgical site image showing the surgical site such as the affected part 12A.
- the endoscope 21 supplies the surgical part image to the CCU 31 via the camera cable 31A.
- The insufflation needle 22 supplies gas (for example, air or carbon dioxide) fed from the insufflation apparatus 34 into the abdomen around the affected part 12A in the body of the patient 12, and also sucks gas out of the body of the patient 12 into the insufflation apparatus 34.
- the energy treatment instrument 23 is a surgical instrument that uses electrical energy, such as an electric knife that cuts the affected part 12A with electric heat.
- the forceps 24 is a surgical instrument for grasping tissue or the like in the living body.
- the cart 30 is equipped with devices as medical equipment constituting the endoscopic surgery system as necessary.
- For example, the CCU 31 and the monitor 37 are mounted on the cart 30.
- the CCU 31 controls the camera head of the endoscope 21 via the camera cable 31A, thereby adjusting, for example, focus, aperture, exposure time, and the like.
- Based on information related to display on the monitor 37, the display device that displays images (such as the surgical site image) supplied from the endoscope 21 via the camera cable 31A, the CCU 31 performs control so that correction processing is applied to the surgical site image from the endoscope 21.
- The CCU 31 applies image processing as the correction processing to the images (such as the surgical site image) supplied from the endoscope 21 via the camera cable 31A, and supplies the corrected surgical site image obtained by this image processing to the monitor 37.
- In FIG. 1, the endoscope 21 and the CCU 31 are connected via the wired camera cable 31A, but they can also be connected wirelessly.
- the light source device 32 is connected to the endoscope 21 via a light guide cable 32A.
- the light source device 32 switches and emits light of various wavelengths as necessary, and supplies the light to the endoscope 21 via the light guide cable 32A.
- the treatment instrument device 33 is a high-frequency output device that supplies a high-frequency current to the energy treatment instrument 23.
- the insufflation apparatus 34 has an insufflation unit and an inhalation unit (both not shown), and performs insufflation and inhalation through the insufflation needle 22.
- The recorder 35 records the surgical site images and the like captured by the endoscope 21.
- The printer 36 prints the surgical site images captured by the endoscope 21.
- The monitor 37 is composed of, for example, an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) panel, and is a display device that displays images supplied from the CCU 31.
- The foot switch 38 is operated by, for example, the operator or an assistant, and supplies an operation signal (trigger signal) corresponding to the foot operation to the CCU 31 or the treatment tool device 33, thereby controlling them.
- the camera which the endoscope 21 has may be a single-lens camera (monocular camera) or a multi-lens camera having two or more eyes such as a stereo camera.
- When a multi-view camera such as a stereo camera is employed as the camera of the endoscope 21, a 3D (three-dimensional) image can be displayed on the monitor 37 as the surgical site image.
- In FIG. 1, only one monitor 37 is provided for displaying the surgical site image, but a plurality of monitors can be provided, that is, one or more monitors in addition to the monitor 37.
- the monitors may have the same size or different sizes.
- the same image captured by the endoscope 21 can be displayed on a plurality of monitors.
- Alternatively, each monitor can display a separate image captured by a different endoscope.
- FIG. 2 is a block diagram showing a first configuration example of the CCU 31 of FIG.
- the CCU 31 includes a UI (User Interface) 41, an image processing parameter determination unit 42, and an image processing unit 43.
- UI User Interface
- the UI 41 is operated by, for example, an operator, an assistant, a scopist, or the like as a user who uses the endoscopic surgery system of FIG.
- the user can input the monitor size of the monitor 37 by operating the UI 41, and the UI 41 supplies the monitor size input by the user operation to the image processing parameter determination unit 42.
- the image processing parameter determination unit 42 acquires the monitor size from the UI 41.
- the image processing parameter determination unit 42 determines image processing parameters used for image processing in the image processing unit 43 according to the monitor size from the UI 41 and supplies the image processing parameters to the image processing unit 43.
- the image processing unit 43 is supplied with an image processing parameter from the image processing parameter determination unit 42 and an operation part image (image data) showing the operation part from the endoscope 21.
- The image processing unit 43 applies correction processing to the surgical site image from the endoscope 21, using the image processing parameters from the image processing parameter determination unit 42, so that image processing corresponding to the monitor size of the monitor 37 on which the corrected image is displayed is performed on the surgical site image. The image processing unit 43 then causes the monitor 37 to display the surgical site image (image data) after the image processing.
- FIG. 3 is a block diagram illustrating a configuration example of the image processing parameter determination unit 42 in FIG.
- the image processing parameter determination unit 42 includes a monitor size acquisition unit 51, a parameter storage unit 52, and a determination unit 53.
- the monitor size acquisition unit 51 acquires, for example, the monitor size supplied from the UI 41 in FIG.
- The parameter storage unit 52 stores information on correction processing appropriate for displaying the surgical site image on each monitor, in association with information related to the display of various monitors, such as the monitor 37, that display the surgical site image. Specifically, the parameter storage unit 52 stores, as such correction processing information, a plurality of (sets of) image processing parameters used in the image processing (correction processing) performed by the image processing unit 43 of FIG. 2.
- According to the monitor size supplied from the monitor size acquisition unit 51, the determination unit 53 determines, from among the plurality of image processing parameters stored in the parameter storage unit 52, the image processing parameter to be used for the image processing of the image processing unit 43, and supplies it to the image processing unit 43 as the parameter of interest.
- FIG. 4 is a diagram illustrating an example of storage of a plurality of image processing parameters in the parameter storage unit 52 of FIG.
- the parameter storage unit 52 stores image processing parameters appropriate for each monitor size in association with each of the plurality of monitor sizes.
- Examples of the image processing include NR (Noise Reduction) processing for reducing noise in the image, and enhancement processing for enhancing an arbitrary portion of the image.
- The enhancement processing includes, for example, edge enhancement processing typified by unsharp mask processing, and band enhancement processing that emphasizes a specific frequency band. Since doctors tend to prefer image quality that emphasizes a frequency band somewhat lower than the high-frequency band, band enhancement processing may be performed in the CCU 31 of the endoscopic surgery system.
- In FIG. 4, parameters for NR processing and parameters for edge enhancement processing are stored as the image processing parameters.
- the parameter storage unit 52 can store a parameter for band enhancement processing instead of a parameter for NR processing or a parameter for edge enhancement processing.
- the parameter storage unit 52 can store a parameter for band enhancement processing in addition to a parameter for NR processing and a parameter for edge enhancement processing. Further, the parameter storage unit 52 can store any one of a parameter for NR processing, a parameter for edge enhancement processing, and a parameter for band enhancement.
- The type of image processing parameter stored in the parameter storage unit 52 is determined according to the type of image processing performed by the image processing unit 43. That is, when the image processing unit 43 can perform one or more of NR processing, edge enhancement processing, band enhancement processing, and the like as the image processing, the parameter storage unit 52 stores the parameters for the image processing performed by the image processing unit 43.
- the parameter storage unit 52 can store image processing parameters for all monitor sizes of monitors that can be connected to the CCU 31.
- the determination unit 53 uses an image processing parameter associated with the monitor size of the monitor 37 among the plurality of image processing parameters stored in the parameter storage unit 52 for image processing in the image processing unit 43. Determine the parameter of interest.
- the parameter storage unit 52 can store image processing parameters for some (plural) monitor sizes among all monitor sizes connectable to the CCU 31.
- In this case, the determination unit 53 determines the image processing parameter associated with the monitor size of the monitor 37 as the parameter of interest.
- When the parameter storage unit 52 does not store an image processing parameter associated with the monitor size of the monitor 37, the determination unit 53 can determine, as the parameter of interest, the stored image processing parameter associated with the monitor size closest to that of the monitor 37, or can determine the parameter of interest by interpolation using the stored image processing parameters.
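The nearest-size fallback and interpolation described above can be sketched as follows. This is an illustrative assumption, not the patent's implementation: the monitor sizes, the parameter names `nr_sigma` and `edge_gain`, and all values are hypothetical, and `determine_parameters` merely stands in for the determination unit 53.

```python
# Hypothetical parameter storage unit 52: image processing parameters
# per monitor size (inches). All names and values are illustrative.
PARAMS_BY_SIZE = {
    19: {"nr_sigma": 1.0, "edge_gain": 1.8},
    27: {"nr_sigma": 2.0, "edge_gain": 1.2},
    55: {"nr_sigma": 4.0, "edge_gain": 0.6},
}

def determine_parameters(monitor_size: float) -> dict:
    """Pick the parameter set for a monitor size.

    Exact match -> stored parameters; otherwise interpolate linearly
    between the two nearest stored sizes (clamping at the extremes).
    """
    sizes = sorted(PARAMS_BY_SIZE)
    if monitor_size in PARAMS_BY_SIZE:
        return dict(PARAMS_BY_SIZE[monitor_size])
    if monitor_size <= sizes[0]:
        return dict(PARAMS_BY_SIZE[sizes[0]])
    if monitor_size >= sizes[-1]:
        return dict(PARAMS_BY_SIZE[sizes[-1]])
    # Linear interpolation between the neighbouring stored sizes.
    lo = max(s for s in sizes if s < monitor_size)
    hi = min(s for s in sizes if s > monitor_size)
    t = (monitor_size - lo) / (hi - lo)
    return {
        key: (1 - t) * PARAMS_BY_SIZE[lo][key] + t * PARAMS_BY_SIZE[hi][key]
        for key in PARAMS_BY_SIZE[lo]
    }
```

A size stored in the table is returned as-is; an in-between size, such as 41 inches here, yields values halfway between the 27-inch and 55-inch entries.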
- the image processing performed by the image processing unit 43 is not limited to NR processing, edge enhancement processing, and band enhancement processing.
- the image processing unit 43 performs, for example, NR processing and edge enhancement processing as image processing.
- FIG. 5 is a diagram for explaining the relationship between the monitor size of the monitor and the contrast sensitivity with respect to the contrast of the image displayed on the monitor.
- FIG. 6 is a diagram for explaining the relationship between the monitor size of the monitor and the noise sensitivity (noise evaluation value) with respect to the noise of the image displayed on the monitor.
- FIG. 5 is quoted from the IEICE "Knowledge Base" (Knowledge Forest), Chapter 5, "Visual system frequency characteristics," IEICE 2010, and FIG. 6 is quoted from Aoyama et al., "Noise Evaluation Method for Heterogeneous Image Output Devices," Journal of the Japan Photography Society 1964, p. 392.
- the distance from the viewer who views the image to the monitor on which the image is displayed is called viewing distance.
- Noise sensitivity is the degree to which noise is visible in the image displayed on the monitor, that is, the degree to which the viewer perceives noise in that image; contrast sensitivity is the degree to which the viewer perceives contrast in the image displayed on the monitor.
- FIG. 5 shows the relationship between the monitor spatial frequency (horizontal axis) and contrast sensitivity (vertical axis).
- When the viewing distance is fixed, as shown in FIG. 5, in the range where the spatial frequency of the monitor is at or below a predetermined value (for example, about 10 c/deg (cycles/degree)), the greater the spatial frequency, the greater the contrast sensitivity.
- Here, when the viewing distance is fixed, a small monitor spatial frequency corresponds to a large monitor size, and a large monitor spatial frequency corresponds to a small monitor size. Therefore, the contrast sensitivity tends to increase as the monitor size decreases (as the spatial frequency increases).
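As a rough illustration of why, at a fixed viewing distance, a larger monitor corresponds to a lower spatial frequency: the highest frequency a monitor can display (its Nyquist limit, in cycles per degree of visual angle) can be estimated from its width, pixel count, and viewing distance. The function below is a back-of-the-envelope sketch; the specific widths, resolution, and 200 cm distance are assumed values, not from the patent.

```python
import math

def max_spatial_frequency(width_cm: float, width_px: int,
                          viewing_distance_cm: float) -> float:
    """Nyquist limit of a monitor in cycles/degree of visual angle.

    One cycle needs two pixels, so the highest displayable frequency
    is (pixels per degree) / 2.
    """
    # Width subtended by one degree of visual angle at this distance.
    cm_per_degree = 2 * viewing_distance_cm * math.tan(math.radians(0.5))
    pixels_per_degree = (width_px / width_cm) * cm_per_degree
    return pixels_per_degree / 2

# Same Full HD image, same 200 cm viewing distance: the larger monitor
# spreads the pixels over more visual angle, so its frequency is lower.
small = max_spatial_frequency(width_cm=60, width_px=1920, viewing_distance_cm=200)
large = max_spatial_frequency(width_cm=120, width_px=1920, viewing_distance_cm=200)
```

Doubling the monitor width at the same pixel count and distance halves the spatial frequency, which is the relationship the passage above relies on.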
- FIG. 6 shows the relationship between the viewing distance (horizontal axis) and the noise evaluation value (vertical axis) representing how the noise appears.
- the noise evaluation value tends to increase as the viewing distance decreases.
- For the same displayed image, a small viewing distance corresponds to a large monitor size, and conversely, a large viewing distance corresponds to a small monitor size.
- the noise evaluation value corresponds to noise sensitivity that represents the degree to which the viewer feels noise in the image displayed on the monitor.
- the noise sensitivity tends to increase as the monitor size increases (viewing distance decreases).
- Therefore, when the monitor size is large, it is desirable to perform NR processing with high intensity as the NR processing of the image processing unit 43, to further reduce noise.
- When the monitor size is large, the area per pixel of the monitor becomes large, so that ringing caused by edge enhancement processing, in addition to noise, is easy to see. Therefore, when the monitor size is large, it is desirable to suppress the occurrence of ringing by performing edge enhancement processing with low intensity as the edge enhancement processing of the image processing unit 43.
- On the other hand, when the monitor size is small, it is desirable to perform edge enhancement processing with high intensity as the edge enhancement processing of the image processing unit 43, to increase contrast and improve visibility.
- If edge enhancement processing with high intensity is performed when the monitor size is large, the contrast becomes too strong, which may increase the eye fatigue of the surgeon viewing the surgical site image.
- Therefore, the parameter storage unit 52 stores the NR processing parameters in association with the monitor size so that NR processing with high intensity is performed when the monitor size is large, and NR processing with low intensity is performed when the monitor size is small. Likewise, the parameter storage unit 52 stores the edge enhancement parameters in association with the monitor size so that edge enhancement processing with low intensity is performed when the monitor size is large, and edge enhancement processing with high intensity is performed when the monitor size is small.
- FIG. 7 is a diagram for explaining an example of NR processing.
- As the NR processing, for example, filtering by a bilateral filter can be employed.
- Let the i-th pixel from the left and the j-th pixel from the top of the surgical site image be the pixel of interest; its pixel value before filtering by the bilateral filter is represented as f(i, j), and its pixel value after filtering as g(i, j).
- Filtering of the pixel of interest by the bilateral filter is performed using the pixel values of m × n pixels in the horizontal and vertical directions centered on the pixel of interest, according to the formula shown in FIG. 7, and the filtered pixel value g(i, j) is obtained.
- The expression for filtering by the bilateral filter includes parameters σ1² and σ2², and these parameters σ1² and σ2² can be adopted as the parameters for the NR processing.
- Therefore, when the monitor size is large, the image processing parameter determination unit 42 can determine parameters σ1² and σ2² with large values as the parameters of interest used for the NR processing.
- the image after filtering by the bilateral filter has a lower contrast than the image before filtering (original image).
- the degree to which the contrast is reduced increases as the strength of the NR process increases.
- filtering by the bilateral filter is an NR process for removing noise, and can also be referred to as a contrast reduction process for reducing contrast or an edge suppression process for suppressing edges.
- When filtering by the bilateral filter is regarded as contrast reduction processing or edge suppression processing, it can be said that the image processing unit 43 performs contrast reduction processing or edge suppression processing with greater intensity as the monitor size increases.
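The bilateral filtering described above can be sketched in plain Python as follows. This is the standard bilateral-filter form (spatial Gaussian weighted by a range Gaussian); the border-clamping choice and the window half-sizes are assumptions for illustration, and the patent's exact formula is the one shown in FIG. 7.

```python
import math

def bilateral_filter(img, half_m, half_n, sigma1_sq, sigma2_sq):
    """Bilateral filter over a 2-D list of pixel values.

    sigma1_sq controls the spatial Gaussian and sigma2_sq the range
    (pixel-value) Gaussian; larger values give stronger smoothing,
    matching the use of larger parameters for larger monitors.
    """
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            num = den = 0.0
            for di in range(-half_m, half_m + 1):
                for dj in range(-half_n, half_n + 1):
                    y = min(max(i + di, 0), h - 1)   # clamp at the borders
                    x = min(max(j + dj, 0), w - 1)
                    weight = math.exp(-(di * di + dj * dj) / (2 * sigma1_sq)) \
                           * math.exp(-(img[y][x] - img[i][j]) ** 2 / (2 * sigma2_sq))
                    num += weight * img[y][x]
                    den += weight
            out[i][j] = num / den
    return out
```

With a moderate σ2², pixels across a strong step edge get near-zero range weights, so flat regions are smoothed while the edge survives; with a very large σ2² the range term vanishes and the filter degenerates into an ordinary Gaussian blur, which is why the text can also describe it as contrast reduction processing.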
- FIG. 8 is a diagram for explaining an example of edge enhancement processing.
- As the edge enhancement processing, for example, an unsharp mask process can be employed.
- the pixel value of the target pixel before unsharp mask processing is represented as f (i, j), and the pixel value after unsharp mask processing is represented as g (i, j).
- The unsharp mask process is performed using the pixel values of the m × n pixels in the horizontal and vertical directions centered on the target pixel, according to the formula shown in FIG. 8, whereby g(i, j) is obtained.
- the expression for unsharp mask processing includes parameters k, m, and n, and these parameters k, m, and n can be used as parameters for edge enhancement processing.
- When the parameters k, m, and n are increased, the edge enhancement processing strength (enhancement effect) increases, and when the parameters k, m, and n are decreased, the edge enhancement processing strength decreases.
- When the monitor size is small, the image processing parameter determination unit 42
- can determine the parameters k, m, and n having large values as the attention parameters used for the edge enhancement processing.
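A minimal sketch of the unsharp mask process, assuming a simple m × n box blur as the smoothing step and illustrative default parameter values (the exact formula of FIG. 8 may differ in details):

```python
def unsharp_mask(img, k=0.7, m=3, n=3):
    """Unsharp masking: g(i, j) = f(i, j) + k * (f(i, j) - blur(i, j)),
    where blur is an m x n box average; larger k, m, and n give a
    stronger enhancement effect."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for j in range(h):
        for i in range(w):
            acc, cnt = 0.0, 0
            for dj in range(-(n // 2), n // 2 + 1):
                for di in range(-(m // 2), m // 2 + 1):
                    jj = min(max(j + dj, 0), h - 1)  # clamp at the borders
                    ii = min(max(i + di, 0), w - 1)
                    acc += img[jj][ii]
                    cnt += 1
            out[j][i] = img[j][i] + k * (img[j][i] - acc / cnt)
    return out
```

Applied to a step edge, the dark side is pushed lower and the bright side higher, which is the enhancement (and potential ringing) effect discussed above.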
- FIG. 9 is a flowchart for explaining an example of processing of the CCU 31 of FIG.
- In step S11, the image processing parameter determination unit 42 waits for a user such as an operator to input the monitor size of the monitor 37 by operating the UI 41, acquires the monitor size, and the process proceeds to step S12.
- In step S12, the image processing parameter determination unit 42 determines, according to the monitor size acquired from the UI 41, the image processing parameter used for the image processing in the image processing unit 43 as the attention parameter, supplies it to the image processing unit 43, and the process proceeds to step S13.
- In step S13, the image processing unit 43 performs the image processing of the surgical part image supplied from the endoscope 21 using the image processing parameter as the attention parameter from the image processing parameter determination unit 42,
- thereby performing, on the surgical part image, the image processing according to the monitor size of the monitor 37.
- The image processing unit 43 then supplies the surgical part image after the image processing to the monitor 37 for display.
- As described above, the image processing unit 43 performs image processing with an intensity corresponding to the monitor size of the monitor 37 on the surgical site image, so that a surgical image with high image quality can be provided.
- For example, when the monitor size is large, NR processing with high intensity and edge enhancement processing with low intensity are performed.
- Conversely, when the monitor size is small, NR processing with low intensity and edge enhancement processing with high intensity are performed.
- As a result, when the monitor size is large, a reduction in the visibility of the surgical part image due to conspicuous noise and ringing can be suppressed. Furthermore, when the monitor size is small, the contrast of the surgical part image can be increased and the visibility can be improved.
- Consequently, the accumulation of fatigue that the surgeon experiences from viewing the surgical part image during a long operation can be reduced.
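The size-dependent choice of intensities described above can be illustrated by a simple table lookup standing in for the parameter storage unit 52 and the determination unit 53. The size classes, parameter names, and values here are hypothetical, not taken from the specification:

```python
# Hypothetical parameter table: upper bound of each size class (inches)
# mapped to the NR / edge-enhancement parameters stored for that class.
PARAM_TABLE = [
    (31,  {"nr_sigma_sq": 20.0,  "edge_k": 0.9}),  # small: weak NR, strong edge
    (55,  {"nr_sigma_sq": 60.0,  "edge_k": 0.5}),
    (999, {"nr_sigma_sq": 120.0, "edge_k": 0.2}),  # large: strong NR, weak edge
]

def determine_parameters(monitor_size_inches):
    """Return the stored parameters of the first size class that covers
    the given monitor size."""
    for upper_bound, params in PARAM_TABLE:
        if monitor_size_inches <= upper_bound:
            return params
    return PARAM_TABLE[-1][1]
```

A larger monitor thus receives a larger NR parameter and a smaller edge-enhancement parameter, and vice versa, as the text describes.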
- The intensity of the image processing of the image processing unit 43 can be determined in consideration of the viewing distance (the distance between the monitor 37 and the operator who views the surgical part image displayed on the monitor 37) in addition to the monitor size.
- For example, the parameter storage unit 52 can store, with a certain distance as a reference viewing distance, a plurality of image processing parameters suitable for a plurality of monitor sizes with respect to the reference viewing distance.
- The determination unit 53 can then obtain the image processing parameter corresponding to the monitor size of the monitor 37 from the plurality of image processing parameters stored in the parameter storage unit 52, and correct that image processing parameter according to the difference between the actual viewing distance and the reference viewing distance.
- the parameter storage unit 52 can store image processing parameters appropriate for each combination of a plurality of viewing distances and a plurality of monitor sizes.
- the determination unit 53 can obtain an image processing parameter corresponding to the actual viewing distance and the monitor size of the monitor 37 from the plurality of image processing parameters stored in the parameter storage unit 52.
- The viewing distance can be input by the user operating the UI 41, or the monitor shooting distance described later can be used as the viewing distance.
- image processing parameters for performing image processing with appropriate intensity for each monitor size (and viewing distance) can be obtained by simulation or experiment, for example.
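The viewing-distance correction of a size-matched parameter might be sketched as follows. The reference distance, the parameter names, and the linear scaling model are all illustrative assumptions, not from the specification:

```python
REFERENCE_VIEWING_DISTANCE_M = 2.0  # hypothetical reference viewing distance

def correct_for_viewing_distance(params, actual_distance_m):
    """Correct size-matched parameters by the ratio of the reference to
    the actual viewing distance: viewing from farther away acts like
    viewing a smaller monitor, so the NR weakens and the edge
    enhancement strengthens (a simple linear model, for illustration)."""
    ratio = REFERENCE_VIEWING_DISTANCE_M / actual_distance_m
    return {"nr_sigma_sq": params["nr_sigma_sq"] * ratio,
            "edge_k": params["edge_k"] / ratio}
```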
- the image processing intensity (and hence the image processing parameter) of the image processing unit 43 can be fixed to the intensity corresponding to the lower limit value when the monitor size is equal to or smaller than the predetermined lower limit value.
- Similarly, the image processing intensity of the image processing unit 43 can be fixed to the intensity corresponding to the upper limit value when the monitor size is equal to or larger than a predetermined upper limit value.
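Fixing the intensity outside the lower and upper limit values amounts to clamping the monitor size used for parameter determination; the limit values here are hypothetical:

```python
def effective_monitor_size(monitor_size, lower_limit=19, upper_limit=80):
    """Clamp the monitor size to [lower_limit, upper_limit] so that the
    image processing intensity is fixed to the limit intensity outside
    that range."""
    return max(lower_limit, min(monitor_size, upper_limit))
```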
- Image processing parameters can also be set for each user. That is, for example, for a certain user A, the image processing parameter associated with the monitor size of the monitor 37 can be used, while for another user B, the image processing parameter associated with the monitor size one size above or one size below that of the monitor 37 can be used.
- In the above description, the CCU 31 performs image processing as correction processing for the surgical site image according to the monitor size (display screen size). Besides this, the correction processing can be performed according to (based on) information related to the display of the monitor 37 as the display device that displays the surgical part image (hereinafter also referred to as display-related information).
- the display related information includes, for example, the brightness of the monitor 37 (image displayed on it), the resolution (display resolution), and the like in addition to the monitor size.
- The brighter the display, the less visible pixel differences become. Therefore, when the brightness of the monitor 37 is high, the intensity of the NR process can be reduced and the intensity of the edge enhancement process can be increased. Conversely, when the brightness of the monitor 37 is low, the intensity of the NR process can be increased and the intensity of the edge enhancement process can be decreased.
- The brightness of the monitor 37 can be acquired, for example, from metadata or other setting information set in the monitor 37, or by using an illuminance sensor.
- the resolution of the monitor 37 can be obtained from setting information set in the monitor 37, for example.
- The correction processing performed on the surgical part image can be performed based on the display-related information of the monitor 37 alone, or based on both the display-related information and the usage status of the monitor 37.
- The usage status of the monitor 37 includes the brightness of the place where the monitor 37 is installed, the viewing distance and viewing time of the user who views the monitor 37 (the image displayed on it), and the like.
- For example, the CCU 31 can change the intensity of the NR processing or the edge enhancement processing as the correction processing according to the display-related information and the viewing time as the usage status of the monitor 37 or the length of the estimated surgery time.
- That is, the CCU 31 can increase the intensity of the NR processing and decrease the intensity of the edge enhancement processing as the viewing time or the estimated surgery time becomes longer. Conversely, the CCU 31 can decrease the intensity of the NR processing and increase the intensity of the edge enhancement processing as the viewing time or the estimated surgery time becomes shorter.
- the CCU 31 can change the intensity of the NR process or the edge enhancement process as the correction process according to the display-related information and the brightness of the surgical light as the usage state of the monitor 37.
- the CCU 31 can increase the strength of the NR process and decrease the strength of the edge enhancement process as the brightness of the surgical light is darker.
- the brightness of the surgical light can be detected using, for example, an illuminometer.
- When the correction processing is performed based on both the display-related information of the monitor 37 and the usage status of the monitor 37, they can be considered with equal weights or with different weights.
- As the image processing serving as the correction processing for the surgical part image, in addition to the NR processing (contrast reduction processing) and the edge enhancement processing, for example, band enhancement processing, a contrast adjustment process that adjusts the contrast by tone curve correction or histogram smoothing, and a parallax adjustment process that adjusts the parallax of a stereo image can be employed.
- As for the contrast adjustment process, the intensity of the process can likewise be changed according to the monitor size.
- As for the parallax adjustment process, when the surgical part image is a stereo image, for example, the larger the monitor size, the smaller the intensity of the parallax adjustment process can be made, and the smaller the monitor size, the larger the intensity of the parallax adjustment process can be made.
- In the following description, the case where the monitor size is used as the display-related information and NR processing and edge enhancement processing are used as the correction processing will be described as an example.
- FIG. 10 is a block diagram showing a second configuration example of the CCU 31 of FIG.
- the CCU 31 includes an image processing parameter determination unit 42 and an image processing unit 43.
- The CCU 31 of FIG. 10 is common with the case of FIG. 2 in that it includes the image processing parameter determination unit 42 and the image processing unit 43, and differs from the case of FIG. 2 in that the UI 41 is not provided.
- the monitor size acquisition unit 51 communicates with the monitor 37 to acquire the monitor size of the monitor 37 from the monitor 37.
- the monitor 37 stores the monitor size of the monitor 37 as metadata.
- the monitor 37 transmits the monitor size as metadata to the monitor size acquisition unit 51 by performing wired communication or wireless communication with the monitor size acquisition unit 51 of the image processing parameter determination unit 42.
- the monitor size acquisition unit 51 acquires the monitor size transmitted as metadata from the monitor 37 and supplies it to the determination unit 53 (FIG. 3).
- Since the monitor size acquisition unit 51 acquires the monitor size of the monitor 37 by communicating with the monitor 37, a user such as an operator is saved the trouble of inputting the monitor size.
- FIG. 11 is a block diagram showing a third configuration example of the CCU 31 of FIG.
- the CCU 31 includes an image processing parameter determination unit 42, an image processing unit 43, and a monitor size estimation unit 61.
- The CCU 31 of FIG. 11 is common with the case of FIG. 2 in that it has the image processing parameter determination unit 42 and the image processing unit 43, and differs from the case of FIG. 2 in that it has the monitor size estimation unit 61 instead of the UI 41.
- the monitor photographed image (image data) obtained by photographing the monitor 37 is supplied to the monitor size estimating unit 61.
- the photographing of the monitor 37 can be performed by any device having a photographing function.
- However, the endoscopic surgery system of FIG. 1 includes the endoscope 21 having a photographing function, so the monitor 37 can be photographed by the endoscope 21 without separately preparing a device having a photographing function (although such a device may be prepared).
- a user such as a surgeon photographs the monitor 37 with the endoscope 21 before the start of surgery, for example.
- a monitor photographed image obtained by photographing the monitor 37 with the endoscope 21 is supplied to the monitor size estimating unit 61.
- the monitor image can be taken by a camera other than the endoscope 21, that is, for example, a camera installed in the operating room such as an operating field camera.
- the monitor size estimation unit 61 estimates the monitor size of the monitor 37 shown in the monitor photographed image from the monitor photographed image supplied from the endoscope 21 and supplies it to the image processing parameter determination unit 42.
- FIG. 12 is a block diagram illustrating a first configuration example of the monitor size estimation unit 61 in FIG.
- the monitor size estimation unit 61 includes a monitor frame detection unit 71 and a monitor size conversion unit 72.
- the monitor captured image is supplied from the endoscope 21 to the monitor frame detection unit 71.
- The monitor frame detection unit 71 detects the monitor frame, which is the outer peripheral portion of the monitor 37 shown in the monitor photographed image from the endoscope 21. Furthermore, the monitor frame detection unit 71 detects the size of the monitor frame, that is, the number of pixels of the monitor 37 reflected in the monitor photographed image, for example, the horizontal and vertical numbers of pixels of the substantially rectangular monitor frame, and supplies it to the monitor size conversion unit 72.
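A crude illustration of detecting the monitor frame and its size in pixels. A real implementation would use proper edge detection; this sketch simply takes the bounding box of pixels with a strong horizontal gradient, and the threshold value is arbitrary:

```python
def detect_monitor_frame(img, threshold=5.0):
    """Return (left, top, right, bottom) of the bounding box of pixels
    with a strong horizontal gradient, as a stand-in for edge-detection-
    based monitor frame detection; None if no strong edge is found."""
    h, w = len(img), len(img[0])
    xs, ys = [], []
    for j in range(h):
        for i in range(1, w):
            if abs(img[j][i] - img[j][i - 1]) >= threshold:
                xs.append(i)
                ys.append(j)
    if not xs:
        return None
    return (min(xs), min(ys), max(xs), max(ys))

def frame_size(box):
    """Horizontal and vertical numbers of pixels of the detected frame."""
    left, top, right, bottom = box
    return (right - left, bottom - top)
```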
- The monitor size conversion unit 72 has a monitor size table (not shown) that associates the monitor frame size observed when a monitor of each monitor size is photographed from a predetermined distance with that monitor size.
- By referring to the monitor size table, the monitor size conversion unit 72 converts the monitor frame size supplied from the monitor frame detection unit 71 into the corresponding monitor size, and outputs it as the estimation result of the monitor size of the monitor 37.
- FIG. 13 is a diagram illustrating an example of processing of the monitor size estimation unit 61 in FIG.
- In the case of FIG. 12, the CCU 31 displays, on the monitor 37 or the like, a message prompting the user to photograph the monitor 37 with the endoscope 21 from a position a predetermined distance away from the monitor 37 before the start of surgery.
- a user such as a surgeon photographs the monitor 37 with the endoscope 21 at a predetermined distance from the monitor 37 in accordance with a message displayed on the monitor 37.
- a monitor photographed image obtained by photographing the monitor 37 with the endoscope 21 is supplied to the monitor size estimating unit 61.
- The monitor frame detection unit 71 performs, for example, edge detection on the monitor photographed image from the endoscope 21 to detect the monitor frame of the monitor 37 shown in the monitor photographed image. Furthermore, the monitor frame detection unit 71 detects the number of pixels as the size of the monitor frame and supplies it to the monitor size conversion unit 72.
- the monitor size conversion unit 72 converts the size of the monitor frame from the monitor frame detection unit 71 into a monitor size by referring to the monitor size table, and outputs the result as an estimation result of the monitor size of the monitor 37.
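The table-based conversion from monitor frame size to monitor size can be sketched as a nearest-neighbour lookup; the tabulated pixel widths and monitor sizes below are hypothetical:

```python
# Hypothetical monitor size table: frame width in pixels observed when a
# monitor of each size (inches) is photographed from the predetermined
# distance, paired with that monitor size.
MONITOR_SIZE_TABLE = [(320, 21), (480, 32), (640, 43), (820, 55)]

def frame_to_monitor_size(frame_width_px):
    """Convert a detected monitor frame size into the monitor size whose
    tabulated frame width is closest (nearest-neighbour lookup)."""
    return min(MONITOR_SIZE_TABLE,
               key=lambda entry: abs(entry[0] - frame_width_px))[1]
```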
- In the case of FIG. 12, the monitor size table must be created in advance, and the user must be made to photograph the monitor 37 from a position a predetermined distance away from the monitor 37.
- FIG. 14 is a block diagram showing a second configuration example of the monitor size estimation unit 61 in FIG.
- the monitor size estimation unit 61 includes a monitor frame detection unit 71, a UI 81, and a monitor size conversion unit 82.
- The monitor size estimation unit 61 of FIG. 14 is common with the case of FIG. 12 in that it has the monitor frame detection unit 71.
- However, the monitor size estimation unit 61 of FIG. 14 differs from the case of FIG. 12 in that the monitor size conversion unit 82 is provided instead of the monitor size conversion unit 72 and the UI 81 is newly provided.
- the UI 81 is operated by a user such as an operator, for example.
- The user can input the monitor shooting distance at the time of photographing the monitor 37 by operating the UI 81, and the UI 81 supplies the monitor shooting distance input by the user's operation to the monitor size conversion unit 82.
- a user such as an operator photographs the monitor 37 with the endoscope 21 before the start of surgery.
- In the case of FIG. 12, the user needs to photograph the monitor 37 from a position a predetermined distance away from the monitor 37, but in the case of FIG. 14,
- the monitor 37 can be photographed from an arbitrary position.
- The user then inputs, by operating the UI 81, the monitor shooting distance at the time of photographing the monitor 37, that is,
- the distance from the position of the user (more precisely, the position of the endoscope 21) when photographing the monitor 37 to the monitor 37.
- The monitor size conversion unit 82 has a distance/monitor size table (not shown) that associates, for each monitor shooting distance, the monitor frame size observed when a monitor of each monitor size is photographed from that distance with that monitor size.
- By referring to the distance/monitor size table, the monitor size conversion unit 82 converts the monitor shooting distance supplied from the UI 81 and the monitor frame size supplied from the monitor frame detection unit 71
- into the monitor size corresponding to that monitor shooting distance and monitor frame size, and outputs the monitor size as the estimation result of the monitor size of the monitor 37.
- In the case of FIG. 14, the user needs to input the monitor shooting distance by operating the UI 81, but can photograph the monitor 37 from an arbitrary monitor shooting distance.
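Instead of a prepared distance/monitor size table, the relation between frame size, shooting distance, and physical monitor size can also be illustrated with a simple pinhole-camera model; the focal length below is a hypothetical value, not that of the endoscope 21:

```python
FOCAL_LENGTH_PX = 1000.0  # hypothetical focal length of the camera, in pixels

def estimate_monitor_size(frame_width_px, frame_height_px, shooting_distance_m):
    """Estimate the physical diagonal of the monitor (in inches) from the
    detected frame size and the monitor shooting distance: under a
    pinhole model, physical size = pixel size * distance / focal length."""
    width_m = frame_width_px * shooting_distance_m / FOCAL_LENGTH_PX
    height_m = frame_height_px * shooting_distance_m / FOCAL_LENGTH_PX
    diagonal_m = (width_m ** 2 + height_m ** 2) ** 0.5
    return diagonal_m / 0.0254  # metres -> inches
```

The same frame size photographed from a larger distance corresponds to a larger monitor, which is why the table in the text is indexed by both quantities.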
- FIG. 15 is a block diagram illustrating a third configuration example of the monitor size estimation unit 61 of FIG.
- the monitor size estimation unit 61 includes a monitor frame detection unit 71, a monitor size conversion unit 82, a depth estimation unit 91, and a monitor shooting distance estimation unit 92.
- the monitor size estimation unit 61 of FIG. 15 is common to the case of FIG. 14 in that it has a monitor frame detection unit 71 and a monitor size conversion unit 82.
- the monitor size estimation unit 61 in FIG. 15 is different from the case in FIG. 14 in that a depth estimation unit 91 and a monitor shooting distance estimation unit 92 are provided instead of the UI 81.
- In FIG. 15, the endoscope 21 has, for example, a stereo camera as a multi-view camera, and outputs a stereo image composed of an L (Left) image and an R (Right) image of two different viewpoints as the monitor photographed image.
- the depth estimation unit 91 is supplied with a monitor photographed image that is a stereo image from the endoscope 21.
- The depth estimation unit 91 takes one of the L image and the R image constituting the stereo image as the standard image and the other as the reference image, and detects, for each pixel of the standard image, the parallax with respect to the reference image.
- Furthermore, the depth estimation unit 91 estimates (obtains), from the parallax of each pixel of the standard image, the depth (the distance in the depth direction) of each pixel of the standard image, that is, the depth of the subject reflected in each pixel of the standard image, and supplies it to the monitor shooting distance estimation unit 92.
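The depth obtained from the parallax follows the standard triangulation relation for a rectified stereo pair, depth = f · B / d; the focal length and baseline below are illustrative values, not those of the endoscope 21:

```python
def depth_from_disparity(disparity_px, focal_px=1000.0, baseline_m=0.005):
    """Triangulation for a rectified stereo pair: depth = f * B / d,
    where f is the focal length in pixels, B the camera baseline, and
    d the parallax (disparity) in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

A larger parallax means a nearer subject, so the monitor frame pixels with their detected parallaxes directly yield the distance to the monitor.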
- In FIG. 15, the monitor frame detection unit 71 detects the size of the monitor frame from the stereo image serving as the monitor photographed image from the endoscope 21; this detection of the monitor frame is performed on the standard image of the L and R images constituting the stereo image.
- Furthermore, the monitor frame detection unit 71 detects, in addition to the size of the monitor frame, the coordinates of the monitor frame, that is, the coordinates of (some or all of) the pixels constituting the monitor frame (and its inside), as monitor coordinates, and supplies them to the monitor shooting distance estimation unit 92.
- The monitor shooting distance estimation unit 92 selects, from the depths of the pixels of the standard image from the depth estimation unit 91, the depths of the pixels at the monitor coordinate positions from the monitor frame detection unit 71, that is, the depths of the pixels in which the monitor frame (and its inside) is reflected.
- The monitor shooting distance estimation unit 92 then estimates the monitor shooting distance from the depths of the pixels in which the monitor frame is reflected, and supplies the monitor shooting distance to the monitor size conversion unit 82.
- That is, the monitor shooting distance estimation unit 92 obtains (estimates), as the monitor shooting distance, for example, the depth of any one pixel among the pixels constituting the monitor frame, or the average value, the mode, the minimum value, or the maximum value of the depths of the pixels constituting the monitor frame, and supplies it to the monitor size conversion unit 82.
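The aggregation of the monitor-frame pixel depths into a single monitor shooting distance can be sketched directly from the options named in the text:

```python
from statistics import mean, mode

def monitor_shooting_distance(frame_pixel_depths, method="mode"):
    """Aggregate the depths of the monitor-frame pixels into one monitor
    shooting distance: any one value, the average, the mode, the minimum,
    or the maximum, as the text describes."""
    if method == "one":
        return frame_pixel_depths[0]
    if method == "mean":
        return mean(frame_pixel_depths)
    if method == "mode":
        return mode(frame_pixel_depths)
    if method == "min":
        return min(frame_pixel_depths)
    if method == "max":
        return max(frame_pixel_depths)
    raise ValueError("unknown method: " + method)
```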
- By referring to the distance/monitor size table, the monitor size conversion unit 82 converts the monitor shooting distance from the monitor shooting distance estimation unit 92 and the monitor frame size from the monitor frame detection unit 71
- into the monitor size corresponding to that monitor shooting distance and monitor frame size, and outputs the monitor size as the estimation result of the monitor size of the monitor 37.
- the user can shoot the monitor 37 at any monitor shooting distance without inputting the monitor shooting distance.
- In FIG. 15, the depth estimation unit 91 estimates the depths of all the pixels of the standard image;
- however, the depth estimation unit 91 can also obtain the depths of only the pixels in which the monitor frame (and its inside) is reflected and the pixels in their vicinity, among the pixels of the standard image.
- In this case, the monitor coordinates are supplied from the monitor frame detection unit 71 not to the monitor shooting distance estimation unit 92 but to the depth estimation unit 91, and the depth estimation unit 91 obtains the depths of only the pixels of the monitor frame among the pixels of the standard image and the pixels in their vicinity, and supplies them to the monitor shooting distance estimation unit 92.
- The monitor shooting distance estimation unit 92 then estimates the monitor shooting distance from the depths of the monitor frame pixels and the pixels in their vicinity supplied from the depth estimation unit 91.
- FIG. 16 is a block diagram illustrating a fourth configuration example of the monitor size estimation unit 61 of FIG.
- the monitor size estimation unit 61 includes a monitor frame detection unit 71, a monitor size conversion unit 82, a monitor shooting distance estimation unit 92, and a depth estimation unit 101.
- the monitor size estimation unit 61 of FIG. 16 is common to the case of FIG. 15 in that it includes a monitor frame detection unit 71, a monitor size conversion unit 82, and a monitor shooting distance estimation unit 92.
- the monitor size estimation unit 61 of FIG. 16 is different from the case of FIG. 15 in that a depth estimation unit 101 is provided instead of the depth estimation unit 91.
- In FIG. 15, the endoscope 21 has a stereo camera, and a stereo image is output as the monitor photographed image.
- In FIG. 16, on the other hand, the endoscope 21 has a monocular camera, and an image of one viewpoint is output as the monitor photographed image.
- Furthermore, in FIG. 16, the endoscope 21 (its camera head) has a movement amount detection function, such as a gyro sensor, for detecting the movement amount of the monocular camera of the endoscope 21,
- and the movement amount of the monocular camera detected by the detection function is supplied to the depth estimation unit 101 as the endoscope movement amount.
- The depth estimation unit 101 is thus supplied with the endoscope movement amount from the endoscope 21, and is also supplied, from the endoscope 21, with images of one viewpoint photographed with the monocular camera as monitor photographed images.
- the user can shoot the monitor 37 at an arbitrary monitor shooting distance, as in the case of FIGS. 14 and 15.
- However, in the case of FIG. 16, the user photographs the monitor 37 from different photographing positions at different timings.
- the depth estimation unit 101 is supplied with the above-described two viewpoint images from the endoscope 21 as a monitor photographed image.
- Based on the endoscope movement amount from the endoscope 21, the depth estimation unit 101 obtains the movement amount (vector) of the monocular camera of the endoscope 21 between the captures of the two viewpoint images serving as monitor photographed images.
- Furthermore, the depth estimation unit 101 takes one of the two viewpoint images serving as monitor photographed images from the endoscope 21 as the standard image and the other as the reference image, and detects the parallax for each pixel of the standard image using the movement amount of the monocular camera.
- The depth estimation unit 101 then estimates, from the parallax of each pixel of the standard image, the depth of each pixel of the standard image (the depth of the subject reflected in each pixel), and supplies it to the monitor shooting distance estimation unit 92.
- the monitor frame size and the monitor coordinates are detected by the monitor frame detection unit 71 as in the case of FIG.
- FIG. 17 is a diagram illustrating the estimation of the depth of each pixel of the reference image in the depth estimation unit 101 in FIG.
- FIG. 17 shows a case where the user has photographed the monitor 37 with the endoscope 21 from two positions separated by a movement amount v.
- In this case, the depth estimation unit 101 can detect the parallax, and hence the depth, using the movement amount v in the same manner as the depth estimation unit 91 of FIG. 15.
- According to the monitor size estimation unit 61 in FIG. 16, even when the endoscope 21 has a monocular camera instead of a multi-view camera such as a stereo camera, and without the user inputting the monitor shooting distance,
- the monitor size of the monitor 37 can be estimated.
- The depth of the subject reflected in the pixels of the monitor photographed image can also be detected (estimated) using, for example, focus information of the endoscope 21 serving as the camera that photographs the monitor 37.
- the depth of the subject reflected in the pixels of the monitor photographed image can be detected using a distance sensor such as a ToF (Time-of-Flight) sensor.
- In a ToF sensor, light is emitted from a light emitting element, and the reflected light obtained when the light is reflected by the subject is received by a light receiving element.
- The distance to the subject is then determined according to the time from when the light is emitted from the light emitting element to when the reflected light is received by the light receiving element.
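The distance computation of a ToF sensor follows directly from the round-trip time of light: the emitted light travels to the subject and back, so the one-way distance is c · t / 2.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_time_s):
    """Distance to the subject from the time between emitting the light
    and receiving its reflection; divide by 2 because the light travels
    out and back."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```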
- The detection of the monitor 37 shown in the monitor photographed image can be performed using edge detection on the monitor photographed image, or using the edge detection together with the depth of the subject reflected in the pixels of the monitor photographed image.
- The detection of the monitor 37 shown in the monitor photographed image can also be performed by registering color information representing the color of the monitor frame in advance and using that color information together with edge detection on the monitor photographed image.
- Furthermore, the shooting angle at which the monitor 37 was photographed can be estimated from the width of the peak region of the integrated values obtained by edge detection (for example, the integrated values of the first-order or second-order differential values obtained by edge detection),
- and the monitor size can be estimated taking that shooting angle into consideration.
- In addition, the monitor and the monitor size used in each surgery can be registered in advance.
- In this case, the monitor size can be obtained from the monitor size registered in advance, without performing estimation (detection).
- FIG. 18 is a block diagram showing a fourth configuration example of the CCU 31 of FIG.
- the CCU 31 includes a UI 41 and an image processing parameter determination unit 42.
- The CCU 31 of FIG. 18 is common with the case of FIG. 2 in that it has the UI 41 and the image processing parameter determination unit 42, and differs from the case of FIG. 2 in that the image processing unit 43 is not provided.
- the monitor 37 includes an image processing unit 43 and an image display unit 111.
- the image display unit 111 is supplied with a surgical part image after image processing from the image processing unit 43.
- the image display unit 111 is a block that controls the original display function of the monitor 37, and displays a surgical part image supplied from the image processing unit 43.
- the surgical part image supplied from the endoscope 21 is supplied to the image processing unit 43 of the monitor 37.
- the image processing parameter determined by the image processing parameter determination unit 42 is supplied to the image processing unit 43 of the monitor 37.
- In FIG. 18, the image processing unit 43 of the monitor 37 performs the image processing of the surgical part image supplied from the endoscope 21 via the CCU 31 using the image processing parameter from the image processing parameter determination unit 42, thereby performing, on the surgical part image, image processing corresponding to the monitor size of the monitor 37. Then, the image processing unit 43 causes the image display unit 111 to display the surgical part image after the image processing.
- the image processing unit 43 can be provided not on the CCU 31 but on the monitor 37.
- The image processing unit 43 can also be provided in both the CCU 31 and the monitor 37, rather than in only one of them.
- In this case, for example, the CCU 31 can perform part of the image processing, such as the NR processing,
- and the monitor 37 can perform the remaining image processing, such as the edge enhancement processing.
- When the image processing unit 43 is provided in the monitor 37 as shown in FIG. 18, the image processing parameter determination unit 42 determines, from the plurality of image processing parameters stored in the parameter storage unit 52 (FIG. 3) and according to the monitor size of the monitor 37,
- the image processing parameter (attention parameter) used for the image processing in the image processing unit 43, and supplies it to the image processing unit 43 of the monitor 37.
- Alternatively, the plurality of image processing parameters (the table) stored in the parameter storage unit 52 (FIG. 3) of the image processing parameter determination unit 42
- can be supplied to the monitor 37,
- and the monitor 37 can determine, from among the plurality of image processing parameters supplied from the CCU 31, the image processing parameter corresponding to the monitor size of the monitor 37.
- The CCU 31 of FIG. 18 can also be configured without the UI 41 for the user to input the monitor size.
- When the CCU 31 is not provided with the UI 41 for the user to input the monitor size,
- the monitor size can be transmitted from the monitor 37 to the CCU 31 as described earlier, or, as described with reference to FIG. 11, estimated from a monitor photographed image obtained by photographing the monitor 37.
- the image processing parameter determination unit 42 can be provided not on the CCU 31 but on the monitor 37.
- FIG. 19 is a block diagram showing a fifth configuration example of the CCU 31 of FIG.
- the CCU 31 includes a UI 41, an image processing parameter determination unit 42, and an image processing unit 43.
- the CCU 31 in FIG. 19 is configured in the same manner as in FIG.
- However, in FIG. 19, for example, two monitors 37 and 37A are connected to the CCU 31 as a plurality of monitors.
- The user inputs the monitor sizes of the monitor 37 serving as the first monitor and the monitor 37A serving as the second monitor by operating the UI 41, and the UI 41 supplies the monitor size of the monitor 37 (hereinafter also referred to as the first monitor size) and the monitor size of the monitor 37A (hereinafter also referred to as the second monitor size), input by the user's operation, to the image processing parameter determination unit 42.
- the image processing parameter determination unit 42 acquires the first monitor size and the second monitor size from the UI 41.
- The image processing parameter determination unit 42 determines an image processing parameter (hereinafter also referred to as a first monitor image processing parameter) to be used for image processing in the image processing unit 43 in accordance with the first monitor size from the UI 41, and supplies it to the image processing unit 43.
- Similarly, the image processing parameter determination unit 42 determines an image processing parameter (hereinafter also referred to as a second monitor image processing parameter) to be used for image processing in the image processing unit 43 in accordance with the second monitor size from the UI 41, and supplies it to the image processing unit 43.
- The image processing unit 43 performs image processing of the surgical part image from the endoscope 21 using the first monitor image processing parameter from the image processing parameter determination unit 42, thereby performing, on the surgical part image, image processing corresponding to the first monitor size of the monitor 37 that displays the surgical part image after the image processing. The image processing unit 43 then outputs the surgical part image after the image processing performed using the first monitor image processing parameter as a first monitor image (image data).
- The first monitor image is supplied to and displayed on the monitor 37 serving as the first monitor.
- Similarly, the image processing unit 43 performs image processing of the surgical part image from the endoscope 21 using the second monitor image processing parameter from the image processing parameter determination unit 42, thereby performing, on the surgical part image, image processing corresponding to the second monitor size of the monitor 37A that displays the surgical part image after the image processing.
- the image processing unit 43 outputs, as a second monitor image (image data), a surgical part image after the image processing performed using the second monitor image processing parameter.
- The second monitor image is supplied to and displayed on the monitor 37A serving as the second monitor.
- As a result, the monitor 37 serving as the first monitor displays a surgical part image with an image quality suitable for the first monitor size of the monitor 37, and the monitor 37A serving as the second monitor displays a surgical part image with an image quality suitable for the second monitor size of the monitor 37A.
- Therefore, when the surgical part image is displayed on each of a plurality of monitors, such as the two monitors 37 and 37A, the accumulation of fatigue caused by viewing the surgical part image can be reduced to the same extent for both the user viewing the surgical part image displayed on the monitor 37 and the user viewing the surgical part image displayed on the monitor 37A.
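The FIG. 19 arrangement can be sketched as one input image processed once per connected monitor, each time with that monitor's own strength. The "image" (a single row of samples) and the smoothing used here are toy stand-ins for the real processing, and all names are invented:

```python
def box_smooth(row, strength):
    """Blend each sample toward its neighbours; strength in [0, 1]."""
    out = []
    for i, v in enumerate(row):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, len(row) - 1)]
        out.append((1 - strength) * v + strength * (left + right) / 2)
    return out

def process_for_monitors(row, strengths_by_monitor):
    """Produce one processed image per monitor from the same source image."""
    return {name: box_smooth(row, s) for name, s in strengths_by_monitor.items()}

# Larger (hypothetical) monitor 37 gets stronger smoothing than monitor 37A.
images = process_for_monitors([10, 50, 10], {"monitor37": 0.8, "monitor37A": 0.2})
```

The same pattern extends directly to three or more monitors: one table entry per monitor, one output image per entry.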
- In FIG. 19, two monitors 37 and 37A are connected to the CCU 31, but three or more monitors can also be connected to the CCU 31.
- In the above case, the same surgical part image (the same content) is displayed on the two monitors 37 and 37A, but different surgical part images can also be displayed on the two monitors 37 and 37A. That is, for example, a surgical part image photographed by the endoscope 21 can be displayed on one of the two monitors 37 and 37A, and a surgical part image photographed by another endoscope (not shown) can be displayed on the other monitor.
- the image processing unit 43 can be provided in each of the monitors 37 and 37A, or can be provided in each of the CCU 31 and the monitors 37 and 37A, as in the case described with reference to FIG.
- In FIG. 19, the CCU 31 is provided with the UI 41 for the user to input the monitor size, but the monitor size can instead be transmitted from the monitor 37 to the CCU 31 as described with reference to FIG. 10, or estimated from a monitor photographed image obtained by photographing the monitor 37 as described with reference to FIG. 11.
- FIG. 20 is a block diagram showing a sixth configuration example of the CCU 31 of FIG.
- the CCU 31 includes a UI 41, an image processing method determination unit 121, and an image processing unit 122.
- The CCU 31 in FIG. 20 is the same as that in FIG. 2 in that it has the UI 41, but differs from the case of FIG. 2 in that the image processing method determination unit 121 and the image processing unit 122 are provided instead of the image processing parameter determination unit 42 and the image processing unit 43.
- the image processing method determination unit 121 acquires the monitor size from the UI 41.
- the image processing method determination unit 121 determines an image processing method for image processing in the image processing unit 122 according to the monitor size from the UI 41, and supplies the image processing method to the image processing unit 122.
- The image processing unit 122 performs image processing of the surgical part image from the endoscope 21 using the image processing method from the image processing method determination unit 121, thereby performing, on the surgical part image, image processing corresponding to the monitor size of the monitor 37 that displays the surgical part image after the image processing. The image processing unit 122 then supplies the surgical part image after the image processing to the monitor 37 for display.
- As described above, image processing corresponding to the monitor size of the monitor 37 can be performed by image processing using an image processing parameter corresponding to the monitor size, or by image processing using an image processing method corresponding to the monitor size.
- That is, the intensity of the image processing performed on the surgical part image can be adjusted not only by changing the image processing parameter without changing the image processing method, but also by changing the image processing method itself.
- Here, the correction processing performed as the image processing on the surgical part image includes NR processing, edge enhancement processing, contrast adjustment processing, parallax adjustment processing, and the like.
- Processing methods for NR processing include, for example, a processing method using a bilateral filter, a processing method using a Gaussian filter, and a processing method using a median filter.
- As an edge enhancement processing method, for example, there is the unsharp mask.
- As contrast adjustment processing methods, there are a processing method using tone curve correction, a processing method using histogram equalization, and the like.
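The unsharp mask named above can be sketched in one dimension: the blurred signal is subtracted from the original and the difference is added back, scaled by an "amount" factor. This is a minimal illustrative sketch, not the document's implementation:

```python
def unsharp_mask(row, amount):
    """1-D unsharp mask: out = v + amount * (v - blurred)."""
    blurred = []
    for i in range(len(row)):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, len(row) - 1)]
        blurred.append((left + row[i] + right) / 3)  # 3-tap box blur
    return [v + amount * (v - b) for v, b in zip(row, blurred)]

# A step edge is exaggerated: the dark side undershoots and the bright
# side overshoots, which reads as a sharper edge.
sharpened = unsharp_mask([10, 10, 50, 50], 1.0)
```

Raising the `amount` factor is one way the strength of this correction processing can be tied to the monitor size.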
- In the image processing method determination unit 121, the processing method of the correction processing is changed according to the monitor size. For example, the processing method of the NR processing is changed to a processing method in which the strength of the NR processing increases as the monitor size increases, and decreases as the monitor size decreases.
- the processing method of the edge enhancement processing is changed to a processing method in which the strength of the edge enhancement processing decreases as the monitor size increases, and the strength of the edge enhancement processing increases as the monitor size decreases.
- the processing method of the contrast adjustment processing can be changed to a processing method in which the strength of the contrast adjustment processing decreases as the monitor size increases, and the strength of the contrast adjustment processing increases as the monitor size decreases.
- Further, the processing method of the parallax adjustment processing can be changed to a processing method in which the strength of the parallax adjustment processing decreases as the monitor size increases, and increases as the monitor size decreases.
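The direction of adjustment described above — NR strength growing with monitor size while edge enhancement, contrast adjustment, and parallax adjustment strengths shrink — can be sketched with a simple linear interpolation. The smallest (19") and largest (80") supported sizes are assumptions for illustration:

```python
def strengths_for_monitor(size_in, lo=19.0, hi=80.0):
    """Map a monitor size to per-process strengths in [0, 1]."""
    t = min(max((size_in - lo) / (hi - lo), 0.0), 1.0)  # 0 at lo, 1 at hi
    return {
        "nr": t,               # larger monitor -> stronger NR
        "edge": 1.0 - t,       # larger monitor -> weaker edge enhancement
        "contrast": 1.0 - t,   # larger monitor -> weaker contrast adjustment
        "parallax": 1.0 - t,   # larger monitor -> weaker parallax adjustment
    }

small, large = strengths_for_monitor(19), strengths_for_monitor(80)
```

Any monotone mapping with these directions would serve; the linear form is only the simplest choice.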
- In FIG. 20, one monitor 37 is connected to the CCU 31, but a plurality of monitors can be connected to the CCU 31 as in the case of FIG. 19.
- The image processing unit 122 can be provided in the CCU 31, in the monitor 37, or in both the CCU 31 and the monitor 37, as in the case described with reference to FIG. 18.
- In FIG. 20, the CCU 31 is provided with the UI 41 for the user to input the monitor size, but the monitor size can instead be transmitted from the monitor 37 to the CCU 31 as described with reference to FIG. 10, or estimated from a monitor photographed image obtained by photographing the monitor 37 as described with reference to FIG. 11.
- FIG. 21 is a block diagram illustrating a configuration example of the image processing method determination unit 121 in FIG.
- The image processing method determination unit 121 includes a monitor size acquisition unit 131, an image processing method storage unit 132, and a determination unit 133.
- the monitor size acquisition unit 131 acquires, for example, the monitor size supplied from the UI 41 in FIG. 20 and supplies the monitor size to the determination unit 133.
- The image processing method storage unit 132 stores information on correction processing appropriate for displaying the surgical part image on each monitor, in association with information on the display of various monitors, such as the monitor 37, that display the surgical part image.
- That is, the image processing method storage unit 132 stores (information representing) a plurality of image processing methods of the image processing (correction processing) that can be performed by the image processing unit 122 in FIG. 20, in association with information on the display of the monitors.
- Specifically, for each of a plurality of monitor sizes, the image processing method storage unit 132 stores the monitor size in association with an image processing method that performs image processing with an intensity appropriate for that monitor size.
- The determination unit 133 determines the image processing method for the image processing of the image processing unit 122 from the plurality of image processing methods stored in the image processing method storage unit 132, according to the monitor size supplied from the monitor size acquisition unit 131, and supplies it to the image processing unit 122.
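The storage-and-determination pair just described can be sketched as a table of method names keyed by monitor size, with the determination step returning the method stored for the closest size. Method names and sizes are illustrative assumptions:

```python
# Hypothetical contents of the image processing method storage unit:
# each monitor size maps to the NR method deemed appropriate for it.
METHOD_TABLE = {
    19: "median",     # small monitor: light NR suffices
    31: "gaussian",
    55: "bilateral",  # large monitor: stronger, edge-preserving NR
}

def determine_method(monitor_size_inches):
    """Determination unit: pick the method stored for the closest size."""
    nearest = min(METHOD_TABLE, key=lambda s: abs(s - monitor_size_inches))
    return METHOD_TABLE[nearest]

print(determine_method(60))  # closest stored size is 55 -> bilateral
```

Unlike the parameter lookup of the earlier configuration, what is selected here is the processing method itself rather than a parameter of a fixed method.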
- The series of processes of the image processing parameter determination unit 42, the image processing unit 43, the monitor size estimation unit 61, the image processing method determination unit 121, the image processing unit 122, and the like described above can be performed by hardware or by software. When the series of processes is performed by software, a program constituting the software is installed in a general-purpose computer or the like.
- FIG. 22 is a block diagram illustrating a configuration example of an embodiment of a computer in which a program for executing the series of processes described above is installed.
- the program can be recorded in advance in a hard disk 205 or ROM 203 as a recording medium built in the computer.
- the program can be stored (recorded) in the removable recording medium 211.
- a removable recording medium 211 can be provided as so-called package software.
- examples of the removable recording medium 211 include a flexible disk, a CD-ROM (Compact Disc Read Only Memory), a MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a magnetic disc, and a semiconductor memory.
- The program can be installed on the computer from the removable recording medium 211 as described above, or downloaded to the computer via a communication network or a broadcast network and installed on the built-in hard disk 205. That is, the program can be transferred from a download site to the computer wirelessly via an artificial satellite for digital satellite broadcasting, or by wire to the computer via a network such as a LAN (Local Area Network) or the Internet.
- the computer incorporates a CPU (Central Processing Unit) 202, and an input / output interface 210 is connected to the CPU 202 via the bus 201.
- When a command is input via the input/output interface 210, the CPU 202 executes a program stored in the ROM (Read Only Memory) 203 in accordance with the command.
- Alternatively, the CPU 202 loads a program stored in the hard disk 205 into a RAM (Random Access Memory) 204 and executes it.
- Thereby, the CPU 202 performs processing according to the flowcharts described above or processing performed by the configurations of the block diagrams described above. Then, as necessary, the CPU 202 outputs the processing result from the output unit 206, transmits it from the communication unit 208, or records it in the hard disk 205, for example via the input/output interface 210.
- the input unit 207 includes a keyboard, a mouse, a microphone, and the like.
- the output unit 206 includes an LCD (Liquid Crystal Display), a speaker, and the like.
- The processing performed by the computer according to the program does not necessarily have to be performed in chronological order in the order described in the flowcharts. That is, the processing performed by the computer according to the program also includes processing executed in parallel or individually (for example, parallel processing or processing by objects).
- the program may be processed by one computer (processor), or may be distributedly processed by a plurality of computers. Furthermore, the program may be transferred to a remote computer and executed.
- In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- The present technology is not limited to an endoscopic surgery system that performs endoscopic surgery as shown in FIG. 1, and can be applied to, for example, a medical electron microscope and, more generally, to any device that displays any image.
- the intensity of the image processing performed on the surgical part image can be adjusted by changing both the image processing parameter and the image processing method.
- Note that the present technology can take the following configurations.
- <1> An image processing apparatus comprising: a control unit that performs control so that correction processing is performed on a surgical part image, in which a surgical part is shown, based on information related to display on a display device that displays the surgical part image.
- <2> The image processing apparatus according to <1>, wherein the information related to the display includes a display screen size, a display resolution, and a brightness of the display device.
- <3> The image processing apparatus according to <1> or <2>, wherein the control unit performs control so that the correction processing is performed on the surgical part image based on the information related to the display and a usage state of the display device.
- <4> The image processing apparatus according to <3>, wherein the usage state includes a brightness of the place where the display device is installed, and a viewing distance and a viewing time of a person viewing the display device.
- <5> The image processing apparatus according to any one of <1> to <4>, wherein the control unit controls the intensity of the correction processing based on the information related to the display.
- <6> The image processing apparatus according to any one of <1> to <5>, wherein the correction processing includes noise removal processing, edge enhancement processing, contrast adjustment processing, and parallax adjustment processing.
- <7> The image processing apparatus according to any one of <1> to <6>, wherein the control unit controls a processing method of the correction processing based on the information related to the display.
- <8> The image processing apparatus according to any one of <1> to <6>, further comprising a storage unit that stores correction processing appropriate for the display in association with the information related to the display, wherein the control unit performs control so that the correction processing stored in the storage unit is performed on the surgical part image.
- <9> The image processing apparatus, wherein the control unit further performs control to acquire the information related to the display.
- <10> The image processing apparatus, wherein the control unit acquires the information related to the display from the display device.
- <12> The image processing apparatus according to <2>, wherein the control unit estimates the display screen size from a captured image obtained by photographing a display screen of the display device, and performs control so that the correction processing is performed on the surgical part image based on the display screen size.
- <13> The image processing apparatus according to <12>, wherein the captured image is an image of at least two viewpoints captured by a multi-view camera, and the control unit estimates a shooting distance from the images of the two viewpoints.
- <14> The image processing apparatus according to <12>, wherein the control unit estimates a shooting distance from focus information of the captured image.
- <15> The image processing apparatus, wherein, when the surgical part image is displayed on a plurality of display devices, the control unit performs control so that the correction processing is performed based on the information related to the display of each of the plurality of display devices.
- <16> The image processing apparatus according to any one of <1> to <15>, wherein the control unit further performs control to display, on the display device, the corrected surgical part image after the correction processing is performed.
- <17> An image processing method comprising performing control so that correction processing is performed on a surgical part image, in which a surgical part is shown, based on information related to display on a display device that displays the surgical part image.
- <18> A surgical system comprising: the display device that displays the corrected surgical part image after the correction processing is performed.
- <19> The surgical system according to <18>, wherein the control unit performs control so that the correction processing is performed based on the information related to the display of each of the plurality of display devices.
- <O1> An image processing apparatus comprising: an image processing unit that performs, on a surgical part image in which a surgical part is shown, image processing with an intensity corresponding to a monitor size of a monitor that displays the surgical part image.
- <O2> The image processing apparatus according to <O1>, wherein the image processing unit performs one or both of noise removal processing and enhancement processing as the image processing.
- <O3> The image processing apparatus according to <O2>, wherein the image processing unit performs the noise removal processing with a greater strength as the monitor size is larger,
- <O5> The image processing apparatus according to <O4>, wherein the monitor size acquisition unit acquires the monitor size input by a user operating a UI (User Interface).
- <O6> The image processing apparatus according to <O4>, wherein the monitor size acquisition unit acquires the monitor size transmitted from the monitor.
- <O7> The image processing apparatus according to any one of <O1> to <O3>, further comprising a monitor size estimation unit that estimates the monitor size from a monitor photographed image obtained by photographing the monitor.
- <O8> The image processing apparatus according to <O7>, wherein the monitor size estimation unit estimates the monitor size based on the number of pixels, among the pixels of the monitor photographed image, in which the monitor is shown.
- <O9> The image processing apparatus according to <O7>, wherein the monitor size estimation unit estimates the monitor size based on the number of pixels, among the pixels of the monitor photographed image, in which the monitor is shown, and a photographing distance when the monitor was photographed.
- <O10> The image processing apparatus according to <O9>, wherein the monitor photographed image is an image of at least two viewpoints photographed by a multi-view camera, and the monitor size estimation unit estimates the photographing distance from the images of the two viewpoints.
- <O11> The image processing apparatus according to <O9>, wherein the monitor photographed image is an image of two viewpoints photographed at different timings from different photographing positions, and the monitor size estimation unit estimates the photographing distance from the images of the two viewpoints.
- <O12> The image processing apparatus according to any one of <O1> to <O11>, wherein the image processing unit performs image processing with an intensity corresponding to each of a plurality of monitor sizes, and outputs the surgical part image after image processing for each of the plurality of monitor sizes.
- <O13> The image processing apparatus according to any one of <O1> to <O12>, wherein the image processing unit performs image processing with an intensity corresponding to the monitor size by performing image processing using a parameter determined according to the monitor size.
- <O14> The image processing apparatus according to any one of <O1> to <O12>, wherein the image processing unit performs image processing with an intensity corresponding to the monitor size by performing image processing with an image processing method determined according to the monitor size.
- <O15> An image processing method comprising performing, on a surgical part image in which a surgical part is shown, image processing with an intensity corresponding to a monitor size of a monitor that displays the surgical part image.
- <O16> A program for causing a computer to function as an image processing unit that performs, on a surgical part image in which a surgical part is shown, image processing with an intensity corresponding to a monitor size of a monitor that displays the surgical part image.
- <O17> A surgical system comprising: an endoscope that captures images; an image processing unit that performs, on a surgical part image in which a surgical part is shown, captured by the endoscope, image processing with an intensity corresponding to a monitor size of a monitor that displays the surgical part image; and the monitor that displays the surgical part image after the image processing by the image processing unit.
- <O18> The surgical system according to <O17>, further comprising a monitor size estimation unit that estimates the monitor size from a monitor photographed image obtained by photographing the monitor with the endoscope.
- <O19> The surgical system according to <O18>, wherein the endoscope captures images with a multi-view camera, and the monitor size estimation unit estimates the photographing distance when the monitor was photographed from the two-viewpoint monitor photographed images captured by the multi-view camera, and estimates the monitor size based on the number of pixels in which the monitor is shown in the monitor photographed image and the photographing distance.
- <O20> The surgical system according to any one of <O17> to <O19>, further comprising a CCU (Camera Control Unit) that controls a camera head of the endoscope, wherein the image processing unit is provided in the CCU, in the monitor, or in both the CCU and the monitor.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Public Health (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Animal Behavior & Ethology (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Radiology & Medical Imaging (AREA)
- Optics & Photonics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Endoscopes (AREA)
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
- Geometry (AREA)
Abstract
Provided are an image processing apparatus, an image processing method, a program, and a surgical system that enable display of an appropriate surgical site image. A control unit performs control so that correction processing for a surgical site image is performed based on information related to the display of a display device that displays the surgical site image showing the surgical site. The present technology can be applied, for example, to a surgical system or the like that displays a surgical site image captured by an endoscope on a monitor.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017561581A JP6764574B2 (ja) | 2016-01-13 | 2016-12-28 | 画像処理装置、画像処理方法、プログラム、及び、手術システム |
| US15/772,560 US10614555B2 (en) | 2016-01-13 | 2016-12-28 | Correction processing of a surgical site image |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016004504 | 2016-01-13 | ||
| JP2016-004504 | 2016-01-13 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017122541A1 true WO2017122541A1 (fr) | 2017-07-20 |
Family
ID=59311028
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/089036 Ceased WO2017122541A1 (fr) | 2016-01-13 | 2016-12-28 | Dispositif de traitement d'image, procédé de traitement d'image, programme et système chirurgical |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US10614555B2 (fr) |
| JP (1) | JP6764574B2 (fr) |
| WO (1) | WO2017122541A1 (fr) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020054732A1 (fr) * | 2018-09-13 | 2020-03-19 | 富士フイルム株式会社 | Caméra fixée à une imprimante et son procédé de commande d'affichage |
| WO2022064998A1 (fr) * | 2020-09-23 | 2022-03-31 | 株式会社Aiメディカルサービス | Dispositif d'aide à l'examen, procédé d'aide à l'examen, et programme d'aide à l'examen |
| JP2024028512A (ja) * | 2020-10-02 | 2024-03-04 | Hoya株式会社 | プログラム、情報処理方法及び内視鏡システム |
| WO2024202094A1 (fr) * | 2023-03-27 | 2024-10-03 | オリンパスメディカルシステムズ株式会社 | Dispositif de traitement d'image médicale, procédé de traitement d'image médicale, et programme de traitement d'image médicale |
Families Citing this family (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2017212811A1 (fr) * | 2016-06-06 | 2017-12-14 | オリンパス株式会社 | Dispositif endoscopique |
| CN110325098A (zh) | 2016-11-28 | 2019-10-11 | 适内有限责任公司 | 具有可分离一次性轴的内窥镜 |
| CN110392546B (zh) * | 2017-03-07 | 2022-09-02 | 索尼公司 | 信息处理设备、辅助系统和信息处理方法 |
| US11426507B2 (en) * | 2017-08-21 | 2022-08-30 | RELIGN Corporation | Arthroscopic devices and methods |
| CN108364618B (zh) * | 2018-03-14 | 2021-01-01 | 京东方科技集团股份有限公司 | 移位寄存器单元及其驱动方法、栅极驱动电路、显示装置 |
| US11357593B2 (en) | 2019-01-10 | 2022-06-14 | Covidien Lp | Endoscopic imaging with augmented parallax |
| US11625825B2 (en) | 2019-01-30 | 2023-04-11 | Covidien Lp | Method for displaying tumor location within endoscopic images |
| USD1018844S1 (en) | 2020-01-09 | 2024-03-19 | Adaptivendo Llc | Endoscope handle |
| USD1051380S1 (en) | 2020-11-17 | 2024-11-12 | Adaptivendo Llc | Endoscope handle |
| USD1070082S1 (en) | 2021-04-29 | 2025-04-08 | Adaptivendo Llc | Endoscope handle |
| USD1031035S1 (en) | 2021-04-29 | 2024-06-11 | Adaptivendo Llc | Endoscope handle |
| USD1066659S1 (en) | 2021-09-24 | 2025-03-11 | Adaptivendo Llc | Endoscope handle |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001211325A (ja) * | 2000-01-28 | 2001-08-03 | Minolta Co Ltd | 画像処理装置 |
| JP2003024273A (ja) * | 2001-07-13 | 2003-01-28 | Olympus Optical Co Ltd | 画像処理装置 |
| JP2010187250A (ja) * | 2009-02-13 | 2010-08-26 | Fujitsu Ltd | 画像補正装置、画像補正プログラムおよび画像撮影装置 |
| JP2010279507A (ja) * | 2009-06-03 | 2010-12-16 | Hoya Corp | 電子内視鏡システム |
| JP2010279457A (ja) * | 2009-06-03 | 2010-12-16 | Hoya Corp | 電子内視鏡、電子内視鏡システムおよび色調整方法 |
| JP2013244044A (ja) * | 2012-05-23 | 2013-12-09 | Olympus Corp | 内視鏡システム、内視鏡装置及びプログラム |
| WO2014163109A1 (fr) * | 2013-04-03 | 2014-10-09 | オリンパスメディカルシステムズ株式会社 | Système endoscope destiné à afficher des images 3-d |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4182794B2 (ja) | 2003-04-02 | 2008-11-19 | コニカミノルタエムジー株式会社 | 医用画像表示方法及び医用画像表示システム |
| JP4086035B2 (ja) * | 2004-12-09 | 2008-05-14 | セイコーエプソン株式会社 | 自動画像補正回路 |
| US7907166B2 (en) * | 2005-12-30 | 2011-03-15 | Intuitive Surgical Operations, Inc. | Stereo telestration for robotic surgery |
| JP4886751B2 (ja) * | 2008-09-25 | 2012-02-29 | 株式会社東芝 | 車載用表示システム及び表示方法 |
| US8884735B2 (en) * | 2008-11-17 | 2014-11-11 | Roger Li-Chung | Vision protection method and system thereof |
| JP2011033707A (ja) * | 2009-07-30 | 2011-02-17 | Canon Inc | 画像処理装置及びその制御方法 |
| US8706184B2 (en) * | 2009-10-07 | 2014-04-22 | Intuitive Surgical Operations, Inc. | Methods and apparatus for displaying enhanced imaging data on a clinical image |
| US20110115766A1 (en) * | 2009-11-16 | 2011-05-19 | Sharp Laboratories Of America,Inc. | Energy efficient display system |
| JP4763827B2 (ja) * | 2009-11-26 | 2011-08-31 | 富士フイルム株式会社 | 立体画像表示装置、複眼撮像装置及び立体画像表示プログラム |
| US9066658B2 (en) * | 2010-03-23 | 2015-06-30 | Stryker Corporation | Method and system for video based image detection/identification analysis for fluid and visualization control |
| WO2012002106A1 (fr) * | 2010-06-30 | 2012-01-05 | 富士フイルム株式会社 | Dispositif d'affichage d'image tridimensionnelle, procédé d'affichage d'image tridimensionnelle, programme d'affichage d'image tridimensionnelle et support d'enregistrement |
| US8684914B2 (en) * | 2011-08-12 | 2014-04-01 | Intuitive Surgical Operations, Inc. | Image capture unit and an imaging pipeline with enhanced color performance in a surgical instrument and method |
| WO2013073028A1 (fr) * | 2011-11-16 | 2013-05-23 | 株式会社 東芝 | Dispositif de traitement d'image, dispositif d'affichage d'image tridimensionnelle, procédé de traitement d'image et programme de traitement d'image |
| JP5904281B2 (ja) * | 2012-08-10 | 2016-04-13 | 株式会社ニコン | 画像処理方法、画像処理装置、撮像装置および画像処理プログラム |
| US10568522B2 (en) * | 2013-10-23 | 2020-02-25 | The Trustees Of Dartmouth College | Surgical vision augmentation system |
| WO2015100310A1 (fr) * | 2013-12-23 | 2015-07-02 | Camplex, Inc. | Systèmes de visualisation chirurgicale |
| TWI564590B (zh) * | 2015-04-02 | 2017-01-01 | tai-guo Chen | Image can strengthen the structure of the glasses |
-
2016
- 2016-12-28 WO PCT/JP2016/089036 patent/WO2017122541A1/fr not_active Ceased
- 2016-12-28 JP JP2017561581A patent/JP6764574B2/ja not_active Expired - Fee Related
- 2016-12-28 US US15/772,560 patent/US10614555B2/en active Active
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001211325A (ja) * | 2000-01-28 | 2001-08-03 | Minolta Co Ltd | 画像処理装置 |
| JP2003024273A (ja) * | 2001-07-13 | 2003-01-28 | Olympus Optical Co Ltd | 画像処理装置 |
| JP2010187250A (ja) * | 2009-02-13 | 2010-08-26 | Fujitsu Ltd | 画像補正装置、画像補正プログラムおよび画像撮影装置 |
| JP2010279507A (ja) * | 2009-06-03 | 2010-12-16 | Hoya Corp | 電子内視鏡システム |
| JP2010279457A (ja) * | 2009-06-03 | 2010-12-16 | Hoya Corp | 電子内視鏡、電子内視鏡システムおよび色調整方法 |
| JP2013244044A (ja) * | 2012-05-23 | 2013-12-09 | Olympus Corp | 内視鏡システム、内視鏡装置及びプログラム |
| WO2014163109A1 (fr) * | 2013-04-03 | 2014-10-09 | オリンパスメディカルシステムズ株式会社 | Système endoscope destiné à afficher des images 3-d |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020054732A1 (fr) * | 2018-09-13 | 2020-03-19 | Fujifilm Corporation | Printer-equipped camera and display control method therefor |
| CN112689795A (zh) * | 2018-09-13 | 2021-04-20 | Fujifilm Corporation | Printer-equipped camera and display control method therefor |
| JPWO2020054732A1 (ja) * | 2018-09-13 | 2021-09-30 | Fujifilm Corporation | Printer-equipped camera and display control method therefor |
| CN112689795B (zh) * | 2018-09-13 | 2022-06-14 | Fujifilm Corporation | Printer-equipped camera and display control method therefor |
| JP7177165B2 (ja) | 2018-09-13 | 2022-11-22 | Fujifilm Corporation | Printer-equipped camera and display control method therefor |
| US11995804B2 (en) | 2018-09-13 | 2024-05-28 | Fujifilm Corporation | Printer-equipped camera and displaying control method thereof |
| WO2022064998A1 (fr) * | 2020-09-23 | 2022-03-31 | AI Medical Service Inc. | Examination support device, examination support method, and examination support program |
| JP2024028512A (ja) * | 2020-10-02 | 2024-03-04 | HOYA Corporation | Program, information processing method, and endoscope system |
| JP7562886B2 (ja) | 2020-10-02 | 2024-10-07 | HOYA Corporation | Program, information processing method, and endoscope system |
| WO2024202094A1 (fr) * | 2023-03-27 | 2024-10-03 | Olympus Medical Systems Corp. | Medical image processing device, medical image processing method, and medical image processing program |
Also Published As
| Publication number | Publication date |
|---|---|
| US20190096037A1 (en) | 2019-03-28 |
| US10614555B2 (en) | 2020-04-07 |
| JPWO2017122541A1 (ja) | 2018-11-01 |
| JP6764574B2 (ja) | 2020-10-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6764574B2 (ja) | | Image processing device, image processing method, program, and surgical system |
| US20210015343A1 (en) | | Surgical assistance apparatus, surgical method, non-transitory computer readable medium and surgical assistance system |
| CN107710756B (zh) | | Medical stereoscopic observation device, medical stereoscopic observation method, and program |
| US20190051039A1 (en) | | Image processing apparatus, image processing method, program, and surgical system |
| JP2015531271A (ja) | | Surgical image processing system, surgical image processing method, program, computer-readable recording medium, medical image processing device, and image processing inspection device |
| CN108778143B (zh) | | Computing device for overlaying a laparoscopic image with an ultrasound image |
| JPWO2018084003A1 (ja) | | Medical image processing device, medical image processing method, and program |
| US11481179B2 (en) | | Information processing apparatus and information processing method |
| JP5698068B2 (ja) | | Image processing device, image display system, radiographic imaging system, image processing program, and image processing method |
| JP7517339B2 (ja) | | Surgical image display system, image processing device, and image processing method |
| US11446113B2 (en) | | Surgery support system, display control device, and display control method |
| JP2019512178A (ja) | | Light-level adaptive filter and method |
| WO2013038355A1 (fr) | | Real-time 3D X-ray visualization |
| US12070182B2 (en) | | Signal processing device, imaging device, and signal processing method |
| US11523729B2 (en) | | Surgical controlling device, control method, and surgical system |
| JP7456385B2 (ja) | | Image processing device, image processing method, and program |
| WO2016194446A1 (fr) | | Information processing device, information processing method, program, and in-vivo imaging system |
| WO2016114155A1 (fr) | | Image processing device, image processing method, program, and endoscope system |
| CN111465916B (zh) | | Information processing device, information processing method, and program |
| US11451698B2 (en) | | Medical system and control unit |
| US20210218873A1 (en) | | Imaging device, gain setting method, and program |
| WO2020195877A1 (fr) | | Medical system, signal processing device, and signal processing method |
| WO2020203405A1 (fr) | | Medical observation system, method, and device |
| JP2005027359A (ja) | | X-ray diagnostic apparatus |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16885141; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2017561581; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 16885141; Country of ref document: EP; Kind code of ref document: A1 |