
WO2020203034A1 - Endoscope system - Google Patents

Endoscope system

Info

Publication number
WO2020203034A1
WO2020203034A1 (PCT/JP2020/009616)
Authority
WO
WIPO (PCT)
Prior art keywords
data
unit
information
image
output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2020/009616
Other languages
English (en)
Japanese (ja)
Inventor
青野 進
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Publication of WO2020203034A1

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances
    • G — PHYSICS
    • G02 — OPTICS
    • G02B — OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00 — Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 — Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes

Definitions

  • the present invention relates to an endoscope system that presents various information obtained from endoscopic image data, each detection unit, each constituent unit, and the like, as information used when performing endoscopic observation or treatment under endoscopic observation.
  • an endoscope system including an endoscope having an elongated tube-shaped insertion portion is widely used in, for example, the medical field and the industrial field.
  • the medical endoscope system used in the medical field is configured so that, for example, the insertion portion is inserted into a body cavity of a living body to observe the inside of an organ or the like, and so that various treatments can be applied to a target organ or the like using a predetermined treatment tool as necessary.
  • in the industrial endoscope system, the insertion portion is inserted into a device such as a jet engine, a factory pipe, or a mechanical device, and the system is configured so that conditions such as scratches or corrosion inside the device or the mechanical device can be observed and inspected.
  • various information (for example, shape, hue, saturation, brightness, frequency characteristics, distance, etc.) can be obtained from the endoscopic image data.
  • various information during endoscopic observation (for example, position information of the tip of the endoscope, time information, etc.) can also be obtained.
  • various information regarding the state of each constituent unit (for example, the amount of light emitted by the light source device, wavelength information, the output amount of the energy treatment device, output time information, etc.) can also be obtained.
  • the endoscope system disclosed in the above-mentioned republished patent WO2016/151888 and the like determines the degree of progress of a treatment such as PDT (photodynamic therapy) by irradiating the target with therapeutic light.
  • the endoscope system disclosed in Japanese Patent Publication No. 2005-237641 and the like is said to more appropriately maintain the lamp and the like that supply the illumination light of the light source device by detecting the amount of illumination light.
  • the system disclosed in the above-mentioned Japanese Patent Publication No. 5-285099 is an X-ray imaging system, which includes a detection device for detecting the residual dose of the X-rays emitted from the X-ray irradiation device.
  • the endoscopic system disclosed by the above-mentioned republished patent WO2016 / 151888 uses the luminance value information among the information acquired from the endoscopic image data.
  • the endoscope system disclosed in Japanese Patent Publication No. 2005-237641 uses light quantity information as information acquired from the light source device.
  • the system disclosed in the Japanese Patent Publication No. 5-285099 and the like uses X-ray dose information as information acquired by the detection device.
  • the present invention has been made in view of the above points, and an object of the present invention is to provide an endoscope system that can immediately present, in real time on a display device, various information acquired during endoscopic observation or during treatment under endoscopic observation, and that enables efficient and reliable observation and treatment.
  • the endoscope system of one aspect of the present invention includes: an imaging unit that forms an optical image of a subject to generate image data; a time measurement unit that acquires time information corresponding to the image data; an external device information acquisition unit that acquires output energy amount information and output time information output from an energy treatment device used simultaneously in an inspection using the imaging unit; a data integration unit that outputs integrated data by associating the time information for the image data with the information acquired by the external device information acquisition unit; and a recording unit that records the integrated data output from the data integration unit as one integrated endoscope image group based on the time information.
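As an illustrative sketch only (not part of the patent disclosure — Python, with hypothetical names and a hypothetical time-window matching rule), the association performed by the data integration unit, pairing each image frame's time information with the externally acquired output information, might look like:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    t: float          # time information from the time measurement unit
    image: bytes      # image data from the imaging unit

@dataclass
class EnergyLog:
    t: float          # output time information
    output: float     # output energy amount information

def integrate(frames, energy_logs, window=0.5):
    """Associate each frame with the energy-log entries whose timestamps
    fall within `window` seconds of the frame time (hypothetical rule)."""
    integrated = []
    for f in frames:
        matched = [e for e in energy_logs if abs(e.t - f.t) <= window]
        integrated.append({"time": f.t, "image": f.image, "energy": matched})
    return integrated
```

The half-second matching window is an assumption for illustration; the patent only states that the two data streams are associated via their time information.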
  • according to the present invention, it is possible to provide an endoscope system in which various information acquired during endoscopic observation or during treatment under endoscopic observation is immediately presented in real time using a display device, enabling efficient and reliable observation and treatment.
  • Block configuration diagram showing a main configuration of the endoscope system of the first embodiment of the present invention
  • Block configuration diagram showing a main configuration of the endoscope system of the second embodiment of the present invention
  • FIG. 1 is a block configuration diagram showing a main configuration in the endoscope system according to the first embodiment of the present invention.
  • the endoscope system 1 of the first embodiment of the present invention is configured to include an endoscope 10, a light source device 20, a processor 30, an analysis processing device 40, a monitor 50 as a display device, an energy treatment device 60, and the like.
  • the endoscope 10 has an elongated tube-shaped insertion portion, and is a device configured so that the insertion portion is inserted into, for example, a body cavity of a living body to observe and inspect the inside of an organ or the like, and so that various treatments can be performed on a target organ or the like using a predetermined treatment tool as necessary. Therefore, an imaging unit 11 is provided inside the tip of the insertion portion of the endoscope 10.
  • the image pickup unit 11 is composed of an image pickup element (imager) 11a, an image pickup optical system 11b, and the like.
  • the image pickup device (imager) 11a includes a photoelectric conversion element that outputs image data by receiving an optical image imaged by the image pickup optical system 11b and performing a photoelectric conversion process.
  • the imaging optical system 11b is composed of a plurality of or a single optical lens for forming an optical image of a subject.
  • the imaging unit 11 is a constituent unit that forms an optical image of a subject to generate and acquire image data, and the same one as that applied to a conventional endoscope is applied. Therefore, the configuration of the imaging unit 11 itself is assumed to be the same as the conventional one, and detailed description thereof will be omitted.
  • a configuration example having only one imaging unit 11 is shown, but the present embodiment is not limited to this. For example, by providing two or more imaging units 11, image data capable of forming a stereo image (3D image) can be generated, and distance information and the like can be acquired based on the two or more sets of image data acquired by the two or more imaging units 11.
  • a light guide fiber 12 is inserted and arranged in the insertion portion of the endoscope 10.
  • the light guide fiber 12 is provided between the illumination optical system 13 provided on the tip end surface of the insertion portion of the endoscope 10 and the connector portion (not shown) of the light source device 20.
  • the light guide fiber 12 serves to transmit the luminous flux emitted from the light source device 20 to the illumination optical system 13 on the front end surface of the insertion portion of the endoscope 10.
  • the light emitted from the illumination optical system 13 is directed toward the observation target site 101 of the subject 100 such as a patient, and illuminates the vicinity of the observation target site 101.
  • the light source device 20 is a device for supplying illumination light to the endoscope 10.
  • the light source device 20 is composed of, for example, a white light light source 21, an excitation light light source 22, a splitter 23, a condenser lens 24, and the like.
  • the white light source 21 is a light source that emits white light. Specifically, for example, a light-emitting diode (LED) or a xenon lamp is applied to the white light source 21. When light-emitting diodes are used, for example, a type in which B, G, and R diodes are combined to generate white light may be used.
  • the excitation light source 22 is a light source that emits excitation light (light of a specific wavelength).
  • the splitter 23 is a constituent unit having a dichroic mirror surface that reflects approximately 100% of light at or above a specific wavelength and transmits approximately 100% of normal light.
  • the splitter 23 reflects, for example, the excitation light (light of the specific wavelength) emitted from the excitation light source 22 at a reflectance of approximately 100%, and transmits the white light emitted from the white light source 21 at a transmittance of approximately 100%. The combined light is then emitted toward the condenser lens 24.
  • the condenser lens 24 is an optical lens that condenses the light from each light source (21, 22) and emits it toward the end surface of the light guide fiber 12 provided in the connector portion of the light source device 20.
  • regarding the configuration of the light source device 20 itself, it is assumed that the same configuration as the conventional one is applied, and further description thereof will be omitted. Further, although the white light source 21 and the excitation light source 22 are illustrated in the light source device 20 of the present embodiment, the types of light sources are not limited to these, and light sources that generate other types of light may be adopted.
  • the processor 30 receives an output signal from the imaging unit 11 of the endoscope 10 to generate image data, and acquires various information based on the output signal. Further, the processor 30 acquires detection log data (time-series data) based on the various acquired information and the time information corresponding to each of the acquired information. Further, the processor 30 associates and synthesizes the image data and the detection log data. Then, the processor 30 performs display control for displaying an image based on the generated image log composite data (integrated data), information based on various information data, and the like in a predetermined area on the display screen of the monitor 50.
  • the processor 30 includes, for example, an image reading unit 31, a time measurement unit 31a, a normal light image generation unit 32, a detection unit 33 (parameter detection unit), a detection log data acquisition unit 34, an output unit 35, an image/log data synthesis unit 36 (data integration unit, image composition unit), a display control unit 37, a recording unit 38, an energy output detection unit 39 (external device information acquisition unit), and the like.
  • the image reading unit 31 is a circuit or software program that reads an image signal output from the image sensor 11a of the image pickup unit 11.
  • the time measurement unit 31a is a circuit or software program that measures a predetermined time and outputs the acquired time information. Specifically, for example, the time measurement unit 31a measures the time from the start of reading the image signal by the image reading unit 31 and outputs the acquired time information to the detection log data acquisition unit 34. Further, the time measuring unit 31a measures the energy output time by the energy output unit 61 and outputs the acquired time information to the energy output detecting unit 39.
  • the time measurement unit 31a corresponds to, for example, an internal clock circuit called a real-time clock (RTC) or the like.
  • the time information obtained by the time measuring unit 31a is used as time information related to, for example, date and time information associated with the image data, various numerical data detected by the detection unit 33, detection log data, and the like.
  • the normal optical image generation unit 32 is a circuit or software program that generates image data by receiving an image signal output from the image sensor 11a of the image pickup unit 11.
  • the image data generated by the normal light image generation unit 32 is image data generated based on an image signal acquired when the subject is illuminated with normal light (white light).
  • the detection unit 33 is a parameter detection unit including a circuit or a software program for detecting predetermined numerical data based on an image signal or the like read by the image reading unit 31.
  • the predetermined numerical data detected by the detection unit 33 includes, for example, shape information on the image, hue / saturation / brightness information in the image, frequency characteristics, brightness level, specific wavelength on the image, and the light source unit. There is light quantity information, wavelength information of the light emitted by the light source, and so on.
  • the detection unit 33 is not limited to numerical data based on image signals; it also detects numerical data based on output signals from various other sensors, such as a gyro sensor, a GPS (Global Positioning System) sensor, a temperature sensor, a pressure sensor, an acceleration sensor, and the like.
  • one imaging unit 11 is provided, but in addition to this configuration, for example, an endoscope 10 having two or more imaging units 11 can be considered. With this configuration, it is also possible to acquire numerical data related to distance information as information data acquired based on each image data output from two or more imaging units 11.
  • the detection log data acquisition unit 34 is a circuit or software program that acquires the numerical data acquired by the detection unit 33 as log data. Therefore, the detection log data acquisition unit 34 acquires the time information corresponding to each numerical data acquired by the detection unit 33 from the time measurement unit 31a.
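The role of the detection log data acquisition unit — pairing each numerical value from the detection unit with time information from the time measurement unit — might be sketched as follows (Python, with hypothetical names; the patent does not specify an implementation):

```python
import time

class DetectionLog:
    """Minimal sketch of a detection-log acquisition unit: each recorded
    numerical value is stored together with its acquisition time."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock          # stand-in for the time measurement unit
        self.entries = []            # list of (time, parameter name, value)

    def record(self, name, value):
        """Attach the current time information to a detected value."""
        self.entries.append((self._clock(), name, value))
```

Passing a custom `clock` mirrors the separation the patent describes between the detection unit (values) and the time measurement unit (time information).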
  • the output unit 35 transfers the detection log data acquired by the detection log data acquisition unit 34 and the output data (image log synthesis data, integrated data) from the image / log data synthesis unit 36 to the external analysis processing device 40. It is a circuit or software program that outputs.
  • the image / log data synthesis unit 36 performs a synthesis process for associating the image data generated by the normal optical image generation unit 32 with the detection log data acquired by the detection log data acquisition unit 34, and performs image log synthesis data (integration). It is a data integration unit consisting of a circuit or software program that generates and outputs data).
  • the image / log data synthesizing unit 36 is also an image synthesizing unit that performs an image synthesizing process for displaying the images and information included in the image log synthesizing data (integrated data) on the display screen.
  • the display control unit 37 is a circuit or software program that controls display when displaying an image, information, or the like in a predetermined form on the display screen of the monitor 50.
  • the display control processing performed by the display control unit 37 includes display control in various display forms, such as displaying an image based on the image data generated by the normal light image generation unit 32 in a predetermined area on the display screen of the monitor 50, displaying information based on the detection log data acquired by the detection log data acquisition unit 34 in a predetermined area on the display screen of the monitor 50, and superimposing an image display and an information display.
  • the recording unit 38 is a unit including a recording medium, such as a semiconductor memory, for recording each output data from the image/log data synthesis unit 36 and the detection log data acquisition unit 34, and a circuit or software program for driving the recording medium.
  • the recording unit 38 records the image log composite data (integrated data) output from the image/log data synthesis unit 36 (data integration unit) as one integrated endoscope image group based on the time information (for example, a plurality of still image data recorded as a set of moving image data collected in chronological order).
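The chronological grouping performed by the recording unit might be sketched as follows (Python, hypothetical record format — the patent only states that the records are collected in time order into one group):

```python
def to_image_group(integrated):
    """Collect integrated records into one endoscope image group,
    ordered by their time information, as the recording unit does."""
    return sorted(integrated, key=lambda rec: rec["time"])
```

For example, records arriving out of order are stored as a single time-ordered sequence, which is what allows the group to be replayed as a set of moving image data.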
  • the present invention is not limited to this embodiment, and the recording unit 38 may be provided outside.
  • for example, by using the recording unit 43 of the analysis processing device 40, which is an external device, it is possible to omit the recording unit 38 in the processor 30.
  • the energy output detection unit 39 receives various information data output from the energy output unit 61 of the energy treatment device 60, time information from the time measurement unit 31a, and the like, and receives various information regarding output energy (output amount, output intensity, etc.). It is an external device information acquisition unit consisting of a circuit or software program that detects (output time, etc.) and generates log data.
  • the processor 30 also has various constituent units other than those described above, but since they are not directly related to the present invention, detailed description thereof will be omitted.
  • the analysis processing device 40 is a device that receives and records output data (detection log data, image log composite data, etc.) from the output unit 35 of the processor 30, performs predetermined analysis and determination, and generates a control signal for the energy treatment device 60 based on the analysis and determination results.
  • the analysis processing device 40 includes a detection data analysis determination unit 41 (data analysis unit), a detection data determination reference input unit 42, a recording unit 43, an energy treatment device control unit 44, and the like.
  • the detection data analysis determination unit 41 is a data analysis unit consisting of a circuit or a software program that receives output data (detection log data, image log composite data, etc.) from the output unit 35 of the processor 30 and performs predetermined analysis processing and determination processing.
  • the detection data analysis determination unit 41 performs analysis processing and determination processing on the various detection data input from the output unit 35, such as determining whether the data is within a defined numerical range with respect to the detection data determination reference value input from the detection data determination reference input unit 42, equal to or more than a specified value, or equal to or less than a specified value.
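The range determination described here can be sketched as follows (an illustrative Python fragment with hypothetical names — the patent does not specify an implementation):

```python
def judge(value, ref_min, ref_max):
    """Determination sketch: classify one detection value against a
    reference range supplied by the determination reference input unit."""
    if value < ref_min:
        return "below range"
    if value > ref_max:
        return "above range"
    return "within range"
```

The determination result would then be forwarded to the recording unit, the monitor, and the energy treatment device control unit, as the surrounding text describes.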
  • the analysis determination result data of the detection data analysis determination unit 41 is sent to the energy treatment device control unit 44 in addition to the monitor 50 and the recording unit 43.
  • the detection data determination reference input unit 42 is a circuit or software program that holds a plurality of data, such as determination reference values preset for the detection data, and outputs a predetermined detection data determination reference value or the like to the detection data analysis determination unit 41 at a predetermined timing.
  • the data such as the detection data determination reference value is reference data used in the analysis process and the determination process in the detection data analysis determination unit 41.
  • the recording unit 43 is a unit that receives and records output data (detection log data, image log composite data, etc.) from the output unit 35 of the processor 30, information data related to the analysis determination result output from the detection data analysis determination unit 41, and the like, and includes a recording medium such as a semiconductor memory and a circuit or software program that drives the recording medium.
  • the energy treatment device control unit 44 is a circuit or software program for controlling the energy treatment device 60 based on the analysis determination result data by the detection data analysis determination unit 41.
  • the analysis processing device 40 also has various constituent units other than those described above, but since they are not directly related to the present invention, detailed description thereof will be omitted.
  • the monitor 50 is controlled by the display control unit 37 of the processor 30, and based on the input image data and various information data, the image and various information are visually recognized on the display screen of the monitor 50 in an appropriate predetermined form. It is a display device that can display.
  • the monitor 50 includes, for example, a display panel such as a liquid crystal display (LCD) or an organic electro-luminescence display (OEL), a drive circuit thereof, a software program, and the like.
  • the energy treatment device 60 is a treatment device used when performing a predetermined treatment under endoscopic observation.
  • the energy treatment device 60 is a treatment device that outputs energy (laser light or the like) to treat a predetermined affected portion of a subject (patient). Therefore, the energy treatment device 60 is configured to include an energy output unit 61 and the like.
  • the energy output unit 61 is a mechanism, circuit, or software program for outputting energy (laser light, etc.) for treatment.
  • the energy output unit 61 is controlled by the energy treatment device control unit 44 of the analysis processing device 40.
  • the energy treatment device 60 also has various constituent units other than those described above. However, since the constituent units other than those described above are not directly related to the present invention, detailed description thereof will be omitted.
  • the operations described here assume, for example, a case where the observation target site 101 in the body cavity of the subject 100 is observed using the endoscope 10 and a predetermined treatment is performed using the energy treatment device 60 under the endoscopic observation.
  • first, the user inserts the insertion portion of the endoscope 10 into the body cavity of the subject 100, and arranges the imaging unit 11, provided at the tip of the insertion portion of the endoscope 10, in the vicinity of the desired observation target site 101. During that time, the imaging operation by the imaging unit 11 of the endoscope 10 is continuously performed. At this time, the image signal acquired by the imaging unit 11 is read by the image reading unit 31 of the processor 30 and transmitted to the normal light image generation unit 32 and the detection unit 33.
  • the normal optical image generation unit 32 generates predetermined image data.
  • the detection unit 33 detects various information data as numerical data. Further, the time measuring unit 31a detects the time information for each image signal acquired by the image reading unit 31.
  • as the information data detected by the detection unit 33, there is, for example, information about shapes on the image. With this shape information, for example, blood vessels, ureters, nerves, and the like on an image can be detected and recognized.
  • as the information data detected by the detection unit 33, there is also, for example, information on hue, saturation, brightness, and the like on the image. With this information, for example, bleeding points and blood flow on an image can be detected and recognized.
  • as the information data detected by the detection unit 33, there is also, for example, information regarding frequency characteristics on the image. With this information, shapes on an image, specifically, for example, an organ or a polyp, can be detected and recognized.
  • as the information data detected by the detection unit 33, there is also, for example, information regarding the brightness level and the like. With this information, it is possible, for example, to detect close proximity on an image, to detect whether the tip is inside or outside the body, and to recognize objects such as gauze.
  • as the information data detected by the detection unit 33, there is also information obtained by receiving outputs from various sensors (not shown). Examples of such sensors include a gyro sensor provided at the tip of the endoscope 10 for acquiring and clearly indicating the position of the tip of the endoscope 10.
  • the detection unit 33 detects the tip position of the endoscope 10 based on the output information from the gyro sensor.
  • the position information data as the information data detected by the detection unit 33 in this way is transmitted to the detection log data acquisition unit 34.
  • when two or more imaging units 11 are provided, numerical data related to distance information can also be acquired as information data based on the image data output from the two or more imaging units 11. The information data related to this distance information is numerical data such as the distance from the tip surface of the endoscope 10 to the observation target site 101.
  • the image data generated by the normal optical image generation unit 32 is transmitted to the image / log data synthesis unit 36.
  • various information data (numerical data) detected by the detection unit 33 is transmitted to the detection log data acquisition unit 34.
  • the detection log data acquisition unit 34 acquires detection log data based on various information data detected by the detection unit 33 and time information from the time measurement unit 31a.
  • the detection log data acquired here is transmitted to the image / log data synthesis unit 36, the output unit 35, and the recording unit 38.
  • the image / log data synthesis unit 36 generates image log synthesis data by associating the image data generated by the normal optical image generation unit 32 with the detection log data acquired by the detection log data acquisition unit 34.
  • the image log composite data generated here is transmitted to the output unit 35 and the recording unit 38.
  • the output unit 35 outputs the detection log data, the image log composite data, and the like to the external analysis processing device 40.
  • the data such as the detection log data and the image log synthesis data are recorded in the recording unit 43 of the analysis processing device 40.
  • the detection data analysis determination unit 41 of the analysis processing device 40 executes predetermined analysis processing and determination processing on the detection log data from the output unit 35 based on the detection data determination reference value from the detection data determination reference input unit 42. That is, the detection data analysis determination unit 41 analyzes and determines whether the detection log data input from the output unit 35 is within a predetermined numerical range with respect to the detection data determination reference value input from the detection data determination reference input unit 42, or outside that numerical range.
  • the analysis determination result data of the detection data analysis determination unit 41 is transmitted to the recording unit 43 and recorded on the recording medium of the recording unit 43. Further, the analysis determination result data of the detection data analysis determination unit 41 is also transmitted to the energy treatment device control unit 44.
  • the energy output unit 61 transmits information data (energy output information data) regarding the energy treatment output to the energy output detection unit 39 of the processor 30.
  • the energy output information data is numerical data such as an output energy amount and an output energy fluctuation amount.
  • the time measurement unit 31a measures the time based on the energy treatment output information data from the energy output unit 61, and transmits the measurement result to the energy output detection unit 39 as time information regarding the energy treatment output.
  • the energy output detection unit 39 acquires information data such as the output time of the energy treatment output based on the energy treatment output information data from the energy output unit 61 and the time information from the time measurement unit 31a. This information data is transmitted to the detection log data acquisition unit 34.
  • the detection log data acquisition unit 34 generates log data (energy output log data) related to energy output based on the energy output information data from the energy output detection unit 39 and the time information from the time measurement unit 31a.
  • the energy output log data generated by the detection log data acquisition unit 34 is transmitted to the analysis processing device 40 via the output unit 35.
  • the analysis determination result data of the detection data analysis determination unit 41 is transmitted to the energy treatment device control unit 44.
  • the energy treatment device control unit 44 controls the energy output value from the energy output unit 61 so as to be within a predetermined numerical range based on the analysis determination result data.
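A minimal sketch of such a control step, assuming the "predetermined numerical range" is a simple interval [lower, upper]; the interval limits and the function name are illustrative, not specified in the text:

```python
def control_output(requested, lower, upper):
    """Keep the energy output value within the predetermined
    numerical range [lower, upper] by clamping the request."""
    return max(lower, min(requested, upper))

print(control_output(45.0, 10.0, 40.0))  # 40.0 (clipped to the upper limit)
print(control_output(25.0, 10.0, 40.0))  # 25.0 (already in range)
```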
  • the analysis determination result data of the detection data analysis determination unit 41 is also transmitted to the monitor 50 as described above.
  • output data (detection log data, image log composite data, energy output log data, etc.) from the output unit 35 of the processor 30 is also transmitted to the monitor 50.
  • the monitor 50 is controlled by the display control unit 37 to display the output data from the output unit 35.
  • the display control unit 37 controls the display so that, for example, the detection log data is superimposed on the image displayed based on the image data.
  • the display control unit 37 controls the display so that, for example, an endoscopic image based on the image data (image log composite data) is shown in a predetermined area (the area indicated by reference numeral 50a in FIG. 1) of the display screen of the monitor 50.
  • the display control unit 37 controls the display so that, for example, information based on the detection log data, the energy output log data, and the like (for example, character information) is shown in a predetermined area (the area indicated by reference numeral 50b in FIG. 1) of the display screen of the monitor 50.
  • the display control unit 37 controls the display so that, for example, information based on the analysis determination result data (for example, character information) is shown in a predetermined area (the area indicated by reference numeral 50c in FIG. 1) of the display screen of the monitor 50.
  • various information according to the analysis determination result, for example, assist information regarding the treatment, a warning notification, and the like, may be displayed.
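For instance, the choice between assist information and a warning notification could be driven directly by the determination result. The messages below are invented placeholders, not text from the patent:

```python
def display_message(in_range):
    """Select the character information for display area 50c from
    the analysis determination result (messages are illustrative)."""
    if in_range:
        return "Output within range"
    return "WARNING: output out of range"

print(display_message(True))   # Output within range
print(display_message(False))  # WARNING: output out of range
```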
  • the time information associated with each information data makes it possible to detect and analyze the time required for surgery using the treatment device (the treatment time required for each procedure step, the total time for the entire surgery, and so on).
  • by detecting the order of the procedure steps and also referring to the step transition times, it can be analyzed and determined whether the transitions between the procedure steps are executed smoothly.
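One plausible way to derive per-step durations from time-stamped step events is sketched below; the event format and step names are assumptions for illustration:

```python
def transition_times(step_events):
    """Compute the time spent in each procedure step from
    (timestamp, step_name) events, to check smooth transitions."""
    durations = {}
    for (t0, step), (t1, _) in zip(step_events, step_events[1:]):
        durations[step] = durations.get(step, 0.0) + (t1 - t0)
    return durations

# Illustrative time-series of procedure-step events (seconds, step name)
events = [(0.0, "dissection"), (120.0, "clipping"),
          (180.0, "suturing"), (400.0, "end")]
print(transition_times(events))
# {'dissection': 120.0, 'clipping': 60.0, 'suturing': 220.0}
```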
  • the cooperation between the surgeon and the assistant can be determined.
  • the operator's forceps and treatment operations are detected, the tip position of the endoscope 10 in the body cavity is detected, and the timing, time, and positional relationship of each operation performed by the operator and the scopist are detected. It is therefore possible to determine the cooperation between the surgeon and the scopist.
  • the position of each trocar can be detected.
  • the image based on the image data is displayed in the predetermined area 50a of the monitor 50. Therefore, it is possible to detect whether the object to be observed or the desired portion is clearly captured simply by checking the display screen of the monitor 50. For example, it is possible to determine whether or not an observation target object or a desired part (for example, a target organ or the desired part thereof) is shown in the image.
  • the tip of the forceps and its position can also be confirmed on the display. It is desirable that the object to be observed and the desired portion be, for example, near the center of the display area 50a of the monitor 50.
  • the bleeding site, organ damage site, etc. can be detected from the display image of the monitor 50.
  • if the forceps and retractors held by the assistant are detected and their positions are found, for example, in the peripheral area of the display area 50a of the monitor 50, it can be determined that a wide surgical field is secured.
  • since continuously acquired image data and log data (time-series data) associated with time information are available, the operation time and operation method of the forceps can be detected, for example, by tracking changes in the forceps position in time series. This makes it possible to determine whether or not the correct operation has been performed. In addition, it is possible to detect whether the correct treatment is being performed by detecting the contact time between the forceps and the organ tissue and changes in the forceps position.
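The time-series tracking described above can be sketched as summing the intervals in which the tracked position actually moved. This is a hedged illustration: the 1-D positions, the movement threshold, and the function name are all assumptions, not the patent's method:

```python
def operation_duration(track, moving_threshold):
    """Estimate forceps operation time from time-stamped positions:
    sum the intervals in which the tip moved more than a threshold."""
    total = 0.0
    for (t0, p0), (t1, p1) in zip(track, track[1:]):
        if abs(p1 - p0) > moving_threshold:
            total += t1 - t0
    return total

# (time, 1-D position) samples; data and threshold are illustrative
track = [(0.0, 0.0), (1.0, 0.5), (2.0, 0.5), (3.0, 1.2)]
print(operation_duration(track, moving_threshold=0.1))  # 2.0
```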
  • the type of treatment tool or the like can be detected. Therefore, it is possible to determine whether or not an appropriate treatment tool is selected. At the same time, for example, by detecting the energy output time, output timing, etc., it is possible to determine whether or not proper use is being performed.
  • information such as the shape, color, tissue running, and layer type of the peeling layer can be acquired from the image data, so that the peeling layer can be detected.
  • the blood vessel itself can be detected from the image data, information from the light source device 20, and the like; in addition, a blood vessel clip can be detected, its position can be detected, and the placement time of the blood vessel clip can be detected. Furthermore, the treatment time and treatment position of the blood vessel dissection treatment can be detected.
  • the suture time can be detected by the position and movement of the needle or thread.
  • the suturing time can be detected by detecting the positional relationship between the needle and the organ tissue and its change.
  • various information data acquired from the endoscopic image data and various sensors are obtained as numerical data, and the various information data are associated with each other and configured to be sequentially output and recorded as predetermined log data (time-series data).
  • it is analyzed and determined whether the above-mentioned predetermined log data falls within or outside the specified numerical range with respect to a predetermined reference value specified in advance.
  • the analysis determination result is recorded and displayed as information associated with the endoscopic image displayed on the monitor 50.
  • with the endoscope system 1 of the present embodiment, various information obtained when performing endoscopic observation, or when performing treatment under endoscopic observation, can be displayed in a predetermined display form together with the corresponding endoscopic image. As a result, the user can acquire the corresponding related information data in real time while the observation image is displayed. Therefore, the user can efficiently and reliably perform endoscopic observation and treatment under endoscopic observation.
  • FIG. 2 is a block configuration diagram showing a main configuration in the endoscope system according to the second embodiment of the present invention.
  • the endoscope system 1A of the present embodiment basically has substantially the same configuration as the endoscope system 1 of the first embodiment described above.
  • the difference is that information data from the light source device 20 is acquired as numerical data and associated with time information, and that, in addition to outputting the resulting log data (time-series data), it is analyzed and determined whether the log data falls within or outside the specified numerical range with respect to a predetermined reference value specified in advance.
  • the endoscope system 1A of the present embodiment includes an emitted light amount detection unit 39A (external device information acquisition unit) in the processor 30 and a light source control unit 45 in the analysis processing device 40.
  • the emitted light amount detection unit 39A of the processor 30 is an external device information acquisition unit consisting of a circuit or software program that receives various information data output from each light source (white light source 21, excitation light source 22) of the light source device 20 together with time information from the time measurement unit 31a, detects various information related to the light source output, and generates log data. For this purpose, the time measurement unit 31a further measures the light amount output time of the light source device 20 and outputs the acquired time information to the emitted light amount detection unit 39A.
  • the information data acquired by the emitted light amount detection unit 39A is numerical data such as, for example, emitted light amount information, emitted light intensity information, and emitted light output time output from each light source (white light source 21, excitation light source 22) of the light source device 20.
  • the information data can be used to detect whether the tip of the endoscope 10 is inside or outside the body cavity.
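The patent does not state how this inside/outside detection is performed. One plausible heuristic, based on the fact that automatic light control tends to raise the emitted light amount inside a dark body cavity, might look like the sketch below; the threshold value and function name are purely illustrative assumptions:

```python
def tip_inside_cavity(emitted_light_amount, threshold=0.5):
    """Heuristic sketch: inside a dark body cavity, automatic light
    control typically drives the emitted light amount above what is
    needed under room lighting. Threshold is illustrative only."""
    return emitted_light_amount > threshold

print(tip_inside_cavity(0.9))  # True  (high emitted light -> likely inside)
print(tip_inside_cavity(0.2))  # False (low emitted light -> likely outside)
```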
  • the information data can be used to determine an imaged object, for example, to detect whether or not there is gauze.
  • the information data acquired by the emitted light amount detection unit 39A also includes, for example, a specific wavelength (narrow-band wavelength) of the emitted light output from each light source (white light source 21, excitation light source 22) of the light source device 20.
  • the information data acquired by the emitted light amount detection unit 39A is transmitted to the detection log data acquisition unit 34. Then, in the detection log data acquisition unit 34, log data (emitted light log data) related to the emitted light is generated based on the information data from the emitted light amount detection unit 39A and the time information from the time measurement unit 31a.
  • the emitted light log data generated by the detection log data acquisition unit 34 is transmitted to the analysis processing device 40 via the output unit 35.
  • the light source control unit 45 of the analysis processing device 40 is a circuit or software program for controlling the light source device 20 based on the analysis determination result data by the detection data analysis determination unit 41. Therefore, the analysis determination result data of the detection data analysis determination unit 41 is also sent to the light source control unit 45.
  • the light source control unit 45 controls the light amount value from the light source device 20 to be within a predetermined numerical range based on the analysis determination result data of the detection data analysis determination unit 41.
  • the light source device 20 can be appropriately controlled by acquiring information data from the light source device 20 in addition to the energy treatment device 60.
  • the present invention is not limited to the above-described embodiment, and it goes without saying that various modifications and applications can be carried out within a range that does not deviate from the gist of the invention.
  • the above-described embodiment includes inventions at various stages, and various inventions can be extracted by appropriately combining a plurality of the disclosed constituent requirements. For example, even if some constituent requirements are deleted from all the constituent requirements shown in the above embodiment, the configuration from which those constituent requirements have been deleted can be extracted as an invention, provided that the problem to be solved by the invention can still be solved and the effect of the invention is still obtained.
  • components across different embodiments may be combined as appropriate.
  • the present invention is not limited by any particular embodiment thereof except as limited by the accompanying claims.
  • the present invention can be applied not only to an endoscope control device in the medical field but also to an endoscope control device in the industrial field.


Abstract

The purpose of the present invention is to provide an endoscope system that enables efficient and reliable observation and treatment by promptly presenting a variety of information acquired during endoscopic observation or during treatment under endoscopic observation. To this end, the endoscope system comprises: an imaging unit (11) that generates image data by forming an optical image of a test object; a time measurement unit (31a) that acquires time information corresponding to the image data; an external equipment information acquisition unit (39) that acquires output energy amount information and output timing information delivered by an energy treatment apparatus (60) used concurrently in an examination in which the imaging unit is used; a data integration unit (36) that outputs integrated data by associating the image data with the time information and with the information acquired by the external equipment information acquisition unit; and a recording unit (38, 43) that, on the basis of the time information, records the integrated data output by the data integration unit as an integrated endoscopic image group.
PCT/JP2020/009616 2019-04-03 2020-03-06 Endoscope system Ceased WO2020203034A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019071581A JP2020168208A (ja) Endoscope system
JP2019-071581 2019-04-03

Publications (1)

Publication Number Publication Date
WO2020203034A1 true WO2020203034A1 (fr) 2020-10-08

Family

ID=72668700

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/009616 Ceased WO2020203034A1 (fr) Endoscope system

Country Status (2)

Country Link
JP (1) JP2020168208A (fr)
WO (1) WO2020203034A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005287832A * 2004-03-31 2005-10-20 Olympus Corp Heating treatment device
JP2006230490A * 2005-02-22 2006-09-07 Olympus Medical Systems Corp Specimen examination instrument and examination system
WO2011004801A1 * 2009-07-06 2011-01-13 Fujifilm Corporation Illumination device for endoscope and endoscope device
JP2017513645A * 2014-04-28 2017-06-01 CardioFocus, Inc. System and method for visualizing tissue using an ICG dye composition during an ablation procedure
JP2018504154A * 2014-12-03 2018-02-15 CardioFocus, Inc. System and method for visual confirmation of pulmonary vein isolation during an ablation procedure
WO2018216276A1 * 2017-05-22 2018-11-29 Sony Corporation Observation system and light source control apparatus


Also Published As

Publication number Publication date
JP2020168208A (ja) 2020-10-15


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20782539

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20782539

Country of ref document: EP

Kind code of ref document: A1