
WO2019207740A1 - Movement support system and movement support method - Google Patents

Movement support system and movement support method

Info

Publication number
WO2019207740A1
WO2019207740A1 · PCT/JP2018/017097 · JP2018017097W
Authority
WO
WIPO (PCT)
Prior art keywords
target
image
captured image
unit
imaging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/017097
Other languages
English (en)
Japanese (ja)
Inventor
良 東條
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Priority to PCT/JP2018/017097 priority Critical patent/WO2019207740A1/fr
Priority to JP2020515412A priority patent/JP7093833B2/ja
Publication of WO2019207740A1 publication Critical patent/WO2019207740A1/fr
Priority to US17/079,610 priority patent/US11812925B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 - Operational features of endoscopes
    • A61B1/00004 - Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 - Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000094 - Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 - Operational features of endoscopes
    • A61B1/00004 - Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 - Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000096 - Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 - Operational features of endoscopes
    • A61B1/0002 - Operational features of endoscopes provided with data storages
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 - Operational features of endoscopes
    • A61B1/00043 - Operational features of endoscopes provided with output arrangements
    • A61B1/00045 - Display arrangement
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/005 - Flexible endoscopes
    • A61B1/0051 - Flexible endoscopes with controlled bending of insertion part
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045 - Control thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning

Definitions

  • The present invention relates to a movement support system and a movement support method that facilitate insertion of an endoscope insertion portion and the like.
  • Endoscopes, in which an elongated insertion portion is inserted into a body cavity or the like to observe an examination site and perform various treatments, have been widely used. In the industrial field as well, industrial endoscopes that can observe and inspect internal scratches, corrosion, and the like in boilers, turbines, engines, and chemical plants are widely used.
  • The endoscopic image obtained by the endoscope image sensor is transmitted to a processor that performs signal processing.
  • The processor processes the image from the endoscope and supplies it to a monitor for display or to a recording device for recording.
  • The insertion portion of the endoscope is inserted into a lumen or the like.
  • It is desirable that the insertion portion be inserted smoothly into the lumen or the like. International Publication No. 2008-155828 therefore discloses a technique that detects the position of a dark region as the target of the insertion direction and, when the dark region is lost, indicates the operation direction based on the history of past dark-region position detections.
  • A movement support system according to one aspect includes a target image detection unit that is given a captured image acquired by an imaging unit fixed to a moving body, detects an image of an imaging target included in the captured image, and detects the position of the detected image of the imaging target in the captured image;
  • a position storage unit that stores the position, in the captured image, of the image of the imaging target detected by the target image detection unit;
  • a target position change estimation unit that is given the captured image and estimates a change in the position of the imaging target with reference to the captured image by detecting changes in the captured image; and
  • a target position estimation unit that, when the target image detection unit indicates that the image of the imaging target has disappeared from the captured image, estimates the position of the imaging target with reference to the captured image based on the position information stored in the position storage unit and on the estimation result, obtained by the target position change estimation unit, of the change in the position of the imaging target, and outputs a position estimation result.
  • In a movement support method according to one aspect, a captured image acquired by an imaging unit fixed to a moving body is given, an image of an imaging target included in the captured image and its position in the captured image are detected, and the detected position is stored in a position storage unit.
  • A change in the position of the imaging target with reference to the captured image is estimated by detecting changes in the captured image.
  • When the image of the imaging target has disappeared from the captured image, the position of the imaging target with reference to the captured image is estimated based on the stored position and the estimated change, and a position estimation result is output.
  • A block diagram showing the movement support system according to the first embodiment of the present invention.
  • Explanatory diagrams for explaining the target position estimation method using optical flow.
  • An explanatory diagram showing an example in which the movement of a tracking point becomes irregular.
  • Explanatory diagrams for explaining the operation of the first embodiment.
  • An explanatory diagram showing regions set in a captured image, and explanatory diagrams showing display examples when an imaging target image is lost.
  • FIG. 1 is a block diagram showing a movement support system according to the first embodiment of the present invention.
  • the first embodiment is applied to insertion support when an insertion portion of an endoscope is inserted into a subject.
  • The present embodiment effectively supports movement of a moving body provided with an imaging unit by presenting the direction in which the moving body should be moved, or by controlling its movement toward that direction.
  • As the moving body, not only the insertion portion of a medical or industrial endoscope but also a capsule endoscope, a catheter provided with an imaging unit, and the like can be adopted.
  • Various well-known apparatuses that can move autonomously may also be adopted as the moving body.
  • For example, a robot cleaner that moves indoors, an automatic traveling vehicle that moves on the ground, an unmanned aircraft such as a drone, or an autonomous ship that moves on the water may be used.
  • the movement support system 1 includes an endoscope 2, a movement support system main body 10, and a display device 21.
  • the endoscope 2 has an elongated insertion portion 3, and an operation portion 4 that is operated by an operator is connected to a proximal end portion of the insertion portion 3.
  • a distal end rigid portion 5 is provided at the distal end of the insertion portion 3, and a bending portion 6 composed of a plurality of bending pieces is provided at the rear end.
  • the bending portion 6 can be bent by operating a bending operation knob (not shown) disposed in the operation portion 4.
  • An imaging unit 7 having an imaging device composed of a CCD, a CMOS sensor, or the like is disposed at the distal end rigid portion 5 of the insertion unit 3.
  • The insertion portion 3 is provided with a light guide (not shown) that guides illumination light; illumination light from a light source (not shown) is guided by the light guide and emitted from the distal end rigid portion 5 of the insertion portion 3 toward the subject.
  • the imaging unit 7 obtains an imaging signal based on the subject optical image by photoelectric conversion.
  • the imaging unit 7 transmits an imaging signal to the movement support system main body 10 via the insertion unit 3 and the signal line 8 in the operation unit 4.
  • A direction in which the distal end rigid portion 5 should move (hereinafter referred to as the movement target direction) is, at the time of an examination or the like, the luminal direction formed by a tubular organ such as the intestinal tract.
  • The imaging direction of the imaging unit 7 is set, for example, to be the same as the axial direction of the distal end rigid portion 5. Therefore, if the captured image obtained by the imaging unit 7 includes an image of a target existing in the movement target direction (hereinafter referred to as an imaging target image), that is, an image of the lumen, the distal end rigid portion 5 may be considered to be directed toward the movement target direction.
  • When the insertion portion 3 is inserted, if the surgeon performs the insertion operation so that the image portion of the lumen remains included in the captured image, the insertion portion 3 can be reliably advanced in the depth direction of the lumen. However, once the image of the lumen disappears from the captured image, the insertion portion 3 may rotate around its axis, and it is not easy to recognize the lumen direction from the captured image. Thus, in the present embodiment, by providing the captured image to the movement support system main body 10, the movement support system main body 10 can present the movement target direction.
  • the movement support system main body 10 is provided with a control unit 11.
  • The control unit 11 may be configured by a processor such as a CPU and may control each unit by operating in accordance with a program stored in a memory (not shown), or part or all of its functions may be implemented by a hardware electronic circuit.
  • the control unit 11 controls each unit of the movement support system main body 10.
  • the image processing unit 12 of the movement support system main body 10 performs predetermined signal processing on the imaging signal transmitted via the signal line 8 to obtain a captured image based on the imaging signal.
  • the image processing unit 12 outputs the captured image obtained by the imaging unit 7 to the target image detection unit 13 and the target position change estimation unit 14.
  • the target image detection unit 13 detects a captured target image in the captured image.
  • The target image detection unit 13 may detect the imaging target image in the captured image using deep learning, for example an R-CNN (Regions with CNN features) based on a CNN (Convolutional Neural Network), an FCN (Fully Convolutional Network), or the like.
  • When the imaging target is a lumen, the target image detection unit 13 may detect the imaging target image as a dark region in the captured image. For example, if there is a continuous region of at least a predetermined size whose luminance values are equal to or less than a predetermined value, the target image detection unit 13 may determine that the region is the lumen image. When the shape of the imaging target image is known, the target image detection unit 13 may also detect and track the imaging target image based on image feature amounts.
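  • As a purely illustrative sketch of this dark-region approach (not part of the publication), the lumen candidate could be found by thresholding luminance and keeping the largest connected low-luminance region above a minimum size; the library choice (OpenCV/NumPy), function names, and threshold values below are assumptions.

```python
import cv2
import numpy as np

def detect_lumen_dark_region(gray: np.ndarray, max_luminance: int = 40, min_area: int = 500):
    """Toy illustration: treat the largest sufficiently large dark region as the lumen image.
    The luminance and area thresholds are arbitrary placeholders."""
    dark = (gray <= max_luminance).astype(np.uint8)
    num, labels, stats, centroids = cv2.connectedComponentsWithStats(dark, connectivity=8)
    best = None
    for label in range(1, num):  # label 0 is the background
        area = stats[label, cv2.CC_STAT_AREA]
        if area >= min_area and (best is None or area > stats[best, cv2.CC_STAT_AREA]):
            best = label
    if best is None:
        return None  # no lumen-like dark region found
    cx, cy = centroids[best]
    return float(cx), float(cy)  # position of the imaging target image in the captured image
```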
  • the target image detection unit 13 detects a captured target image in the captured image and obtains a position of the detected captured target image on the captured image.
  • When the imaging target deviates from the imaging range and is positioned outside it, that is, when the detected imaging target image no longer exists in (disappears from) the captured image, the target image detection unit 13 outputs to the end position storage unit 15 the information on the position, on the image, of the imaging target image that was present in the last captured image, that is, the position at the start of disappearance. Further, the target image detection unit 13 outputs disappearance start information to the target position change estimation unit 14 at the start of disappearance of the imaging target image.
  • the end position storage unit 15 is configured by a predetermined storage medium, and stores information from the target image detection unit 13.
  • the target image detection unit 13 may output position information on the image to the end position storage unit 15 for each detection when a captured target image is detected in the captured image. Also in this case, the end position storage unit 15 stores information on the position on the image last detected immediately before the disappearance of the imaging target image.
  • The target position change estimation unit 14 is provided with the captured image from the image processing unit 12 and, based on changes in the captured image, detects the change, from the disappearance start point of the imaging target image, in the relative position of the imaging target with respect to the distal end rigid portion 5 (hereinafter referred to as the target position). Since the imaging unit 7 is fixed to the distal end rigid portion 5, the relative position of the imaging target with respect to the distal end rigid portion 5 may also be regarded as the target position of the imaging target with reference to the imaging range of the imaging unit 7, that is, with reference to the captured image.
  • When the operator inserts or withdraws the insertion portion 3 into or from, for example, the lumen of the subject's intestinal tract, twists the insertion portion 3 around its axis, or bends the bending portion 6, the distal end rigid portion 5 moves in directions orthogonal to the axial direction, advances and retreats along the axial direction, and rotates about the axis.
  • the imaging unit 7 is fixedly arranged with respect to the distal end rigid portion 5, and the imaging direction (imaging range) of the imaging unit 7 changes as the distal end rigid portion 5 moves. Therefore, by observing a change in the imaging range, that is, a change in the captured image, it is possible to obtain the movement amount and movement direction of the distal end rigid portion 5, that is, the change in the target position.
  • In addition, the shape of the organ that is the observation site may change.
  • The relative positions of the lumen and the distal end rigid portion 5 may also change due to displacement of the observation site.
  • the target position change estimation unit 14 is provided with an image change detection unit 14a, and the image change detection unit 14a detects a change in the image of the captured image that is sequentially input.
  • the target position change estimation unit 14 estimates the relative position of the imaging target with reference to the distal end rigid portion 5, that is, the change in the target position, based on the change in the captured image detected by the image change detection unit 14a.
  • the target position change estimation unit 14 can employ, for example, optical flow estimation or deep learning by known image processing as a method for obtaining a change in the target position by a change in the captured image.
  • FIGS. 2 to 4 are explanatory diagrams for explaining a method for estimating a change in target position by optical flow estimation.
  • FIGS. 2 to 4 show image portions of objects (hereinafter also referred to as tracking points) included in the captured image by circles, and arrows between the circles indicate that the objects move. That is, the length of the arrow corresponds to the movement amount, and the direction of the arrow corresponds to the movement direction.
  • the image of the object may be one pixel or a predetermined area composed of a plurality of pixels. 2 to 4, the same pattern circles indicate the same object.
  • FIGS. 2 to 4 show examples in which the movement of the objects at times tn-2, tn-1, and tn is obtained by optical flow estimation.
  • FIG. 2 shows an example in which a predetermined object moves linearly in the captured image.
  • FIG. 3 shows an example in which the predetermined object moves while rotating around a predetermined position in the captured image.
  • FIG. 4 shows an example in which a plurality of objects move on straight lines extending radially from a predetermined point in the captured image.
  • The predetermined object in FIG. 2 moves linearly in the captured image; FIG. 2 shows, for example, the change in the position of the object obtained when the operator bends the bending portion 6 in one direction.
  • The predetermined object in FIG. 3 moves in an arc in the captured image; FIG. 3 shows, for example, the change in the position of the object obtained when the operator twists and rotates the insertion portion 3 in one direction.
  • Each object shown in FIG. 4 moves linearly and radially outward from one point in the captured image; FIG. 4 shows, for example, the change in the positions of the objects obtained when the operator inserts the distal end rigid portion 5 into the lumen.
  • The target position change estimation unit 14 may obtain the optical flow by setting one or more tracking points in the image in order to detect the change in the target position shown in FIG. 2, by setting two or more tracking points in order to detect the change in the target position shown in FIG. 3, and by setting three or more tracking points in order to detect the change in the target position shown in FIG. 4. As the tracking points, characteristic portions such as edge portions in the captured image may be adopted, or the movement may be obtained using all pixels in the image as tracking points.
  • The target position change estimation unit 14 estimates the change in the target position based on the change, obtained by optical flow estimation, in the positions of the tracking points from the start of disappearance, and outputs the estimation result to the target position estimation unit 16. For example, as shown in FIG. 2, if a tracking point moves on the screen by a movement amount corresponding to (vxn, vyn) from time tn-1 to time tn, the target position change estimation unit 14 estimates that, from time tn-1 to time tn, the target position has moved by (vxn, vyn) in the same direction as the movement direction of the tracking point in FIG. 2.
  • Similarly, as shown in FIG. 3, if the tracking points rotate about a predetermined position on the screen from time tn-1 to time tn, the target position change estimation unit 14 estimates that, from time tn-1 to time tn, the target position has rotated by the same angle in the same direction as the movement direction of the tracking points in FIG. 3. Likewise, as shown in FIG. 4, if a tracking point moves by (vxn, vyn) on the screen from time tn-1 to time tn, the target position change estimation unit 14 estimates that, from time tn-1 to time tn, the target position has moved in the insertion direction of FIG. 4 by a movement amount corresponding to (vxn, vyn).
  • The amount of movement of the tracking points and the amount of movement of the target position are considered to be approximately proportional; even if they are not proportional, as long as the approximate position or direction of the imaging target is known, the distal end rigid portion 5 can be moved toward the target position, and the object of the present embodiment can be achieved.
  • FIG. 5 is an explanatory diagram showing an example in which the movement of the tracking point is irregular.
  • FIG. 5 is described by the same method as in FIGS. 2 to 4.
  • the same pattern circles indicate the same object (tracking point).
  • When the movement of the tracking points is irregular as in FIG. 5, the target position change estimation unit 14 may employ, for example, the following methods. It may use, for estimating the change in the target position, the movement shared by the largest number of tracking points judged to have the same movement, or it may use the average of the movements of the tracking points.
  • When the average of the movements of the tracking points is used, the target position change estimation unit 14 may use a weighted average whose weights correspond to the number of tracking points judged to have the same movement.
  • In this way, the change in the target position is estimated.
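  • As one way to picture this estimation (an assumption, not the publication's implementation), the per-frame change could be obtained with a pyramidal Lucas-Kanade optical flow and a population-weighted average of the tracking-point motions; the OpenCV functions and parameter values below are illustrative only.

```python
import cv2
import numpy as np

def estimate_target_shift(prev_gray: np.ndarray, curr_gray: np.ndarray):
    """Sketch: track corner-like points between two frames and aggregate their
    displacements into a single per-frame shift (vx, vy)."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100, qualityLevel=0.01, minDistance=7)
    if pts is None:
        return None
    new_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    ok = status.ravel() == 1
    good_old = pts[ok].reshape(-1, 2)
    good_new = new_pts[ok].reshape(-1, 2)
    if len(good_old) == 0:
        return None
    flows = good_new - good_old  # per-tracking-point motion vectors
    # group tracking points with (nearly) the same motion and weight each group by its size
    rounded = np.round(flows).astype(int)
    groups, counts = np.unique(rounded, axis=0, return_counts=True)
    weights = counts / counts.sum()
    vx, vy = (groups * weights[:, None]).sum(axis=0)
    return float(vx), float(vy)
```

  • The per-frame (vx, vy) returned by such a routine corresponds to the change that is accumulated from the start of disappearance, as described below.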
  • FIG. 6 is an explanatory diagram for explaining a method of estimating a change in target position by machine learning or deep learning.
  • The target position change estimation unit 14 may include an artificial intelligence (AI) 14b that realizes machine learning, deep learning, or the like.
  • the AI 14b receives a captured image before and after the relative position change of the distal rigid portion 5 with respect to the imaging target as teacher data, and generates a model for estimating a change in the target position corresponding to the change in the captured image.
  • the AI 14b estimates a change in the target position (movement vector (a, b)) when the captured image Pn of the frame n is changed to the captured image Pn + 1 in the frame n + 1, using the generated model.
  • the target position change estimation unit 14 outputs information on the change of the target position estimated in the AI 14 b to the target position estimation unit 16.
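  • Purely as an illustration of the kind of model the AI 14b could be (the publication does not specify an architecture), a small regression network mapping a pair of frames Pn and Pn+1 to a movement vector (a, b) might look like the following sketch; the layer sizes, input resolution, and training snippet are all hypothetical.

```python
import torch
import torch.nn as nn

class TargetShiftRegressor(nn.Module):
    """Toy CNN that regresses the target-position change (a, b) from two consecutive frames."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)  # outputs the movement vector (a, b)

    def forward(self, frame_n, frame_n1):
        x = torch.cat([frame_n, frame_n1], dim=1)  # stack grayscale Pn and Pn+1 as 2 channels
        return self.head(self.features(x).flatten(1))

# training sketch with "teacher data": frame pairs straddling a known displacement
model = TargetShiftRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
frame_n = torch.rand(8, 1, 128, 128)   # placeholder batch of frames Pn
frame_n1 = torch.rand(8, 1, 128, 128)  # placeholder batch of frames Pn+1
true_shift = torch.rand(8, 2)          # known (a, b) for each pair
loss = nn.functional.mse_loss(model(frame_n, frame_n1), true_shift)
loss.backward()
optimizer.step()
```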
  • the target position estimation unit 16 reads position information at the start of disappearance of the captured target image from the end position storage unit 15.
  • The target position estimation unit 16 estimates the current position of the imaging target with reference to the captured image (the target position) based on the position at the start of disappearance of the imaging target image and the change in the target position estimated by the target position change estimation unit 14.
  • The change in the target position is a change amount referenced to the start of disappearance of the imaging target image.
  • The target position estimation unit 16 stores the target position estimation result in the integrated value storage unit 17, reads it back, and sequentially updates it using the target position change information from the target position change estimation unit 14.
  • When the estimated position of the imaging target is (x, y), the position at the start of disappearance of the imaging target image is (x0, y0), and the estimated change in the target position for the i-th frame after the start of disappearance is (vxi, vyi), the estimated position (x, y) is obtained by accumulating the changes, that is, by equation (2): x = x0 + Σ vxi, y = y0 + Σ vyi.
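  • A minimal sketch of this accumulation, assuming the form of equation (2) written above; the class and attribute names are illustrative, not taken from the publication.

```python
class TargetPositionEstimator:
    """Start from the position at the start of disappearance and integrate the
    per-frame change estimates, as in equation (2)."""

    def __init__(self, x0: float, y0: float):
        self.x, self.y = x0, y0    # position stored at the start of disappearance
        self.total_travel = 0.0    # integrated travel, usable later as a reliability check

    def update(self, vx: float, vy: float) -> tuple[float, float]:
        self.x += vx
        self.y += vy
        self.total_travel += (vx ** 2 + vy ** 2) ** 0.5
        return self.x, self.y      # current estimated target position
```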
  • the integrated value storage unit 17 is composed of a predetermined recording medium, and stores information from the target position estimation unit 16.
  • the target position estimation unit 16 also outputs the target position estimation result to the display control unit 18.
  • the display control unit 18 is also given a captured image of the endoscope 2 from the image processing unit 12.
  • the display control unit 18 generates display data for displaying a captured image and a display indicating the movement target direction with reference to the captured image, and outputs the display data to the display device 21.
  • The display control unit 18 may display an indication of the current target position of the imaging target with reference to the captured image, either simultaneously with the display indicating the movement target direction or instead of it.
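  • For illustration only, a direction display such as the arrow described above could be rendered by clipping a vector from the image centre toward the estimated position; the OpenCV call and the fixed arrow length are assumptions, not details from the publication.

```python
import cv2
import numpy as np

def overlay_direction(frame: np.ndarray, est_xy: tuple[float, float], length: int = 80) -> np.ndarray:
    """Draw an arrow from the image centre toward the estimated (possibly off-screen) target."""
    h, w = frame.shape[:2]
    centre = (w // 2, h // 2)
    dx, dy = est_xy[0] - centre[0], est_xy[1] - centre[1]
    norm = max((dx * dx + dy * dy) ** 0.5, 1e-6)
    tip = (int(centre[0] + length * dx / norm), int(centre[1] + length * dy / norm))
    out = frame.copy()
    cv2.arrowedLine(out, centre, tip, color=(0, 255, 0), thickness=3, tipLength=0.3)
    return out
```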
  • FIG. 7 is a flowchart for explaining the operation of the first embodiment.
  • 8 to 12 are explanatory diagrams for explaining the operation of the first embodiment.
  • the movement support system according to the present embodiment is used for insertion support when the insertion unit 3 of the endoscope 2 is inserted into the intestinal tract of a human body.
  • the surgeon shall insert the insertion portion 3 into the large intestine from the anus of the subject lying on the examination bed.
  • a state in the intestinal tract at the time of insertion is imaged by the imaging unit 7 provided at the distal end rigid portion 5 of the insertion unit 3.
  • An imaging signal from the imaging unit 7 is supplied to the image processing unit 12 of the movement support system main body 10 via the signal line 8.
  • the image processing unit 12 performs predetermined signal processing on the imaging signal to obtain a captured image.
  • the captured image is supplied to the display control unit 18, converted into a format that can be displayed on the display device 21, and then supplied to the display device 21.
  • The endoscopic image (observation image) at the time of insertion is displayed on the display screen 21a of the display device 21.
  • FIG. 8 shows an example of a captured image (endoscopic image) displayed on the display screen 21a of the display device 21 in this case.
  • the captured image 25 in FIG. 8 indicates that the image portion of the lumen having a relatively low luminance is captured and displayed at the center of the image 25.
  • the axial direction of the distal end rigid portion 5 is substantially directed to the deep portion direction of the lumen.
  • the captured image 26 in FIG. 8 indicates that the image portion of the lumen having a relatively low luminance is captured and displayed above the image 26.
  • the captured image 27 in FIG. 8 indicates that the image portion of the lumen having a relatively low luminance is captured and displayed at the center of the image 27.
  • The image 27 shows that the depth direction of the lumen is toward the left side of the axial direction of the distal end rigid portion 5, with reference to the captured image displayed on the display screen 21a.
  • FIG. 9 shows the change in the position of the image portion of the lumen in the captured image, that is, the position of the captured target image on the image as the distal rigid portion 5 moves.
  • the image portion of the lumen is simplified by a circle.
  • FIG. 9 shows how the position of the image portion of the lumen (the imaging target image) changes in the captured images P1, P2, ... P6 obtained by sequential capturing as time passes in the direction of the arrows.
  • The imaging target image displayed below the center of the captured image P1 moves to the lower end of the image in the captured image P2 (lumen present, immediately before being lost), and in the captured image P3 (no lumen) it departs from the image range and disappears.
  • At the time the captured image P4 is captured (no lumen), the imaging target has moved to the lower left of the image with reference to the captured image P4, and at the time the captured image P5 is captured (no lumen), it has moved to the left of the image with reference to the captured image P5. It is then displayed at the left end of the captured image P6 when the captured image P6 (lumen present) is captured.
  • the target image detection unit 13 is provided with a captured image from the image processing unit 12, and a lumen that is the captured target image is detected in step S1.
  • the target image detection unit 13 determines whether or not a captured target image exists in the captured image.
  • the target image detection unit 13 repeats steps S1 and S2 until it detects the captured target image.
  • When the imaging target image is detected, the process proceeds to step S3, and the position (lumen coordinates) of the detected imaging target image on the image is stored in the end position storage unit 15. For example, the position of the lumen on the image P1 in FIG. 9 is stored in the end position storage unit 15.
  • the target image detection unit 13 continues detection of the imaging target image in step S4, and determines whether or not the imaging target image exists in the image in step S5.
  • While the imaging target image exists in the image, the process proceeds to step S6, and the position (lumen coordinates) of the detected imaging target image on the image is stored in the end position storage unit 15.
  • For example, the position of the lumen on the image P2 in FIG. 9 is stored in the end position storage unit 15.
  • In the next step S7, the control unit 11 determines whether or not a stop of the lumen guide function has been instructed. If the stop of the lumen guide function is not instructed, the control unit 11 returns the process to step S4 and causes the target image detection unit 13 to continue detecting the lumen; if the stop is instructed, the process is terminated.
  • the target image detection unit 13 shifts the process from step S5 to step S8, and outputs to the target position change estimation unit 14 disappearance start information indicating that the captured target image has disappeared.
  • the end position storage unit 15 stores the position coordinates of the center of the lower end of the image P2.
  • The target position change estimation unit 14 is provided with the captured image from the image processing unit 12 and, in step S8, detects the change in the position of the imaging target relative to the distal end rigid portion 5 based on the changes in the sequentially input captured images, and outputs information on the relative change in the target position to the target position estimation unit 16.
  • The target position estimation unit 16 acquires the information on the position of the imaging target image at the start of disappearance from the end position storage unit 15 and, based on the information on relative target position changes input from the target position change estimation unit 14, obtains the current position of the imaging target and stores it in the integrated value storage unit 17 (step S9).
  • The target position estimation unit 16 obtains the momentarily changing current position of the imaging target from the position information stored in the integrated value storage unit 17 and from the relative target position changes sequentially input from the target position change estimation unit 14, and updates the current position of the imaging target stored in the integrated value storage unit 17.
  • the estimated value of each lumen position at the time of imaging of the captured images P3, P4, and P5 of FIG. 9 is held in the integrated value storage unit 17 while being updated.
  • the target position estimation unit 16 also outputs the estimated position of the captured target image to the display control unit 18.
  • The display control unit 18 is provided with the information on the estimated position coordinates of the lumen from the target position estimation unit 16 and generates display data indicating the direction of the estimated lumen position with respect to the endoscopic image from the image processing unit 12.
  • The display control unit 18 outputs a composite image of the endoscopic image and the direction display to the display device 21.
  • In this way, an endoscopic image is displayed on the display screen 21a of the display device 21, and when no lumen image is present in the endoscopic image, a direction display indicating the current lumen direction is displayed (step S10).
  • FIG. 10 shows a display example in this case.
  • FIG. 10 shows a display example at the time the captured image P4 of FIG. 9 is captured; the arrow display H1 pointing toward the lower left of the captured image P4 indicates that the lumen, which is the imaging target during insertion of the insertion portion 3, is located at the lower left of the captured image P4.
  • The display control unit 18 can show not only the direction in which the imaging target exists but also a display indicating the relative position of the imaging target with respect to the captured image. FIGS. 11 and 12 show display examples in this case.
  • a display area R1 of the captured image P11 is provided at the center on the display screen 21a, and a display area R2 for displaying the relative position of the imaging target is provided around the display area R1.
  • the display H11 indicating the relative position of the imaging target is displayed in the region R2.
  • A direction display H22 indicating the direction of the imaging target with respect to the captured image is also shown.
  • In the above description, the target position change estimation unit 14 detects the relative change in the target position from the start point of disappearance of the imaging target image, but the detection of the relative change in the target position may be started before the disappearance start time. For example, when accumulation of estimation error poses no problem, the detection of the relative change in the target position may be started when the imaging target image enters a predetermined range at the edge of the captured image.
  • As described above, in the present embodiment, the change in the position of the imaging target is estimated based on changes in the captured image.
  • Even after the imaging target image disappears from the captured image, its relative position is estimated, and a display indicating that position or direction can be shown together with the captured image.
  • FIG. 13 is a flowchart showing a modification.
  • a change in relative target position is detected from the disappearance start time of the captured target image, and the relative position of the captured target is estimated.
  • When the time elapsed from the disappearance start time is relatively long, or when the cumulative value of the changes in the target position, that is, the distance or the path length between the estimated target position and the target position at the disappearance start time with reference to the captured image, is large, the estimation error of the estimated position of the imaging target may be large. (The following describes determination by distance, but the same applies to determination by path length.) Therefore, in these cases, an error message indicating that the estimated direction and position of the imaging target are not valid may be displayed, or the estimation result may be hidden.
  • the control unit 11 detects these cases and causes the display control unit 18 to display an error message or the like.
  • The display control unit 18 may display an error message such as "The lumen position cannot be estimated," or may indicate that the reliability of the position display and the direction display is low by blinking them.
  • FIG. 13 is different from the flow of FIG. 7 in that steps S11, S12, and S13 are added.
  • In step S11, the control unit 11 determines whether or not the integrated value stored in the integrated value storage unit 17 has exceeded a predetermined threshold. When the integrated value exceeds the predetermined threshold, the control unit 11 moves the process to step S13 and causes the display control unit 18 to display an error message or the like.
  • If it is determined in step S11 that the integrated value does not exceed the predetermined threshold, the control unit 11 determines in step S12 whether or not the elapsed time from the disappearance start time exceeds a predetermined threshold. When the elapsed time from the disappearance start time exceeds the predetermined threshold, the control unit 11 shifts the processing to step S13 and causes the display control unit 18 to display an error message or the like.
  • If the control unit 11 determines in step S12 that the elapsed time from the disappearance start time does not exceed the predetermined threshold, it causes the display control unit 18 to perform the normal display in the next step S10.
  • Thus, in this modification, when the period over which the relative position of the imaging target has been estimated exceeds the threshold, or when the distance to the estimated position exceeds the threshold, the reliability of the estimated position is judged to be relatively low and an error message is displayed. This prevents an erroneous operation from being performed based on a display with low reliability.
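  • A minimal sketch of the checks in steps S11 and S12, assuming a pixel-distance threshold for the integrated value and a time threshold for the elapsed period; the numeric values are placeholders, since the publication only refers to predetermined thresholds.

```python
# Hypothetical thresholds; the publication only states that predetermined values are used.
MAX_INTEGRATED_TRAVEL = 400.0   # pixels accumulated since the start of disappearance
MAX_ELAPSED_SECONDS = 5.0

def guidance_is_reliable(integrated_travel: float, elapsed_seconds: float) -> bool:
    """Steps S11/S12 of the modification: report low reliability (error display)
    when either the accumulated travel or the elapsed time exceeds its threshold."""
    if integrated_travel > MAX_INTEGRATED_TRAVEL:   # step S11
        return False
    if elapsed_seconds > MAX_ELAPSED_SECONDS:       # step S12
        return False
    return True                                     # step S10: normal display
```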
  • FIG. 14 is a block diagram showing a second embodiment of the present invention.
  • In FIG. 14, the same components as those in FIG. 1 are denoted by the same reference numerals, and their description is omitted.
  • In the first embodiment, the target position is estimated from the time when the imaging target image no longer exists in the captured image, that is, from the disappearance start time at which the disappearance is detected.
  • However, in some cases the target image detection unit 13 cannot detect the imaging target image; the present embodiment corresponds to such cases.
  • This embodiment is different from FIG. 1 in that a movement support system main body 31 using a target position estimation unit 19 in place of the target position estimation unit 16 is adopted.
  • the target image detection unit 13 detects a captured target image in the captured image and supplies information on the position in the image to the end position storage unit 15.
  • The end position storage unit 15 stores the position of the imaging target image. Further, the end position storage unit 15 stores information on an area used to determine whether the position information at the time of disappearance is valid or invalid (hereinafter referred to as a target disappearance position determination area), and information on an area used to determine whether or not to output the estimated target position (hereinafter referred to as an output determination area).
  • FIG. 15 is an explanatory diagram for explaining these areas.
  • the target disappearance position determination area is set to a predetermined area substantially in the center of the area (range) of the captured image.
  • the output determination area is set to the same area as the captured image area, or is set wider than the target disappearance position determination area in the captured image area.
  • FIG. 15 shows an example in which the output determination area is set wider than the target disappearance position determination area in the captured image area.
  • The target position estimation unit 19 estimates the position of the imaging target, determines whether the position at the start of estimation, that is, the position at the time of disappearance, is outside the target disappearance position determination area, and determines whether the estimated position is outside the output determination area.
  • The target position estimation unit 19 does not output the position estimation result to the display control unit 18 when the position at the time of disappearance is within the target disappearance position determination area. That is, when estimation starts from the vicinity of the center of the image, the distance over which the position must be estimated becomes long, so the target position estimation unit 19 determines that the reliability of the position estimation of the imaging target is extremely low and invalid, and the display corresponding to the position estimation result is not shown.
  • Even when the position estimation of the imaging target is valid, the target position estimation unit 19 outputs the position estimation result to the display control unit 18 only when the estimated position is outside the output determination area. When the estimated position is inside the output determination area, the operator can recognize the location of the imaging target, so the target position estimation unit 19 does not cause a display according to the position estimation result; unnecessary display is thereby reduced and the operator's visibility is improved. Furthermore, when the estimated target position is in the vicinity of the center of the captured image, the direction of the estimated target position with respect to the center of the captured image becomes less reliable. For this reason as well, it is desirable to hide the estimated target position when it is within the output determination area. In this case, when the estimated target position is outside the output determination area, a display indicating the estimated position and direction of the imaging target is displayed on the display screen 21a of the display device 21.
  • The output determination area may be set to the same area as the captured image; however, due to errors in the estimated position, the estimated position may lie inside the area of the captured image while the actual target position lies outside it.
  • For this reason, the output determination area is set inside the area of the captured image in accordance with the accuracy of the estimated position.
  • the target position estimation unit 19 may not output the position estimation result to the display control unit 18 even when it is determined that the position at the time of disappearance is within the region of the captured image. That is, also in this case, the target position estimation unit 19 determines that the reliability of the position estimation of the imaging target is low, and does not display the display according to the position estimation result.
  • the method for hiding the estimation result as described above is desirably set according to acceptable reliability.
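  • The two-region decision of this embodiment might be sketched as follows; the rectangle coordinates are hypothetical examples of a target disappearance position determination area and an output determination area for a 640x480 captured image.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Hypothetical geometry for a 640x480 captured image.
disappearance_area = Rect(160, 120, 480, 360)   # central target disappearance position determination area
output_area = Rect(40, 30, 600, 450)            # output determination area, slightly inside the frame

def should_output_estimate(lost_xy: tuple[float, float], est_xy: tuple[float, float]) -> bool:
    """Suppress the estimate if disappearance started near the image centre, or if the
    estimated position still falls inside the output determination area."""
    if disappearance_area.contains(*lost_xy):
        return False   # estimation started too far from the edge; reliability is very low
    if output_area.contains(*est_xy):
        return False   # target should still be visible; no guidance display is needed
    return True
```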
  • When the display control unit 18 does not display the estimated position and direction of the imaging target at the time of disappearance, it may instead display on the display screen 21a an indication that the imaging target image cannot be detected in the captured image.
  • As described above, in the present embodiment, the display of the estimated position of the imaging target is shown or hidden depending on whether the imaging target is located outside the range of the captured image and on the reliability of the estimate; this prevents the surgeon from performing erroneous operations based on a display with relatively low reliability.
  • the imaging target is a lumen such as the intestinal tract.
  • the imaging target may not be a lumen, but may be an affected part such as a polyp.
  • the present invention may be applied not only to medical endoscopes but also to industrial endoscopes.
  • the imaging target is not limited to the lumen, and may be various inspection targets such as inspection parts or a route to reach the inspection target.
  • FIG. 16 and 17 are explanatory diagrams showing display examples when the imaging target image disappears when the imaging target is an inspection target.
  • FIG. 16 shows an example of a captured image obtained by imaging the inside of a pipe with an industrial endoscope (not shown). The captured image 61 of the inside of the pipe includes an image 63 of the lumen and an image 61a of an inspection target.
  • In this case, the imaging target image is the image 61a of the inspection target.
  • a captured image 62 in FIG. 16 illustrates an example in which the image 61a, which is the target image for imaging, has disappeared from the captured image.
  • the display 62a indicating that the estimated position of the inspection target that is the imaging target is in the lower left direction with respect to the captured image 62 is displayed.
  • the operator can easily grasp in which direction the insertion portion should be directed in order to confirm the inspection object.
  • FIG. 17 shows an example of a captured image obtained by capturing an image of a turbine blade using an industrial endoscope (not shown).
  • The captured image 71 includes a blade image 71a and a crack image 71b, which is a defective portion.
  • In this case, the imaging target image is the crack image 71b.
  • The captured image 72 includes the blade image 72a, but the image 71b, which is the imaging target image, has disappeared from the captured image.
  • the display 72b indicating that the estimated position of the crack that is the imaging target is in the left direction with respect to the captured image 72 is displayed. From this display 72b, the operator can easily grasp in which direction the insertion portion should be directed in order to confirm the crack.
  • FIG. 18 is a block diagram showing a movement support system according to the third embodiment of the present invention.
  • In the embodiments described above, display is performed based on the estimated position of the imaging target.
  • In the present embodiment, the movement of the moving body including the imaging unit is controlled based on the estimated position of the imaging target.
  • the third embodiment is applied to insertion control when an insertion portion of an endoscope is inserted into a subject.
  • As the moving body whose movement is controlled, not only the insertion portion of a medical or industrial endoscope but also a capsule endoscope, a catheter provided with an imaging unit, and the like can be employed. Furthermore, various well-known apparatuses that can move autonomously may also be employed.
  • the insertion robot 51 includes arms 52L and 52R.
  • the operation unit 4 is supported by the arm 52L
  • the insertion unit 3 is supported by the arm 52R.
  • The arm 52R of the insertion robot 51 is provided with an advance/withdrawal mechanism (not shown) and a rotation mechanism for the insertion portion 3, and the arm 52L is provided with a drive mechanism for the bending operation knob (not shown) provided in the operation portion 4.
  • the insertion robot 51 includes a drive control unit 53 that drives the arms 52L and 52R.
  • the drive control unit 53 of the insertion robot 51 drives the arms 52L and 52R according to the drive control signal from the movement support system main body 40.
  • the insertion robot 51 has an operation panel 54.
  • the user performs an operation of switching between start and end of automatic operation on the operation panel 54.
  • The operations performed automatically are not limited to all of the operations.
  • For example, only the bending operation knob may be operated automatically while the insertion/withdrawal operation is performed by the user, so that only some of the operations are performed automatically, with switching performed on the operation panel 54.
  • An imaging signal obtained by an imaging unit 7 (not shown) provided at the distal end rigid portion 5 of the insertion unit 3 of the endoscope 2 is supplied to the image processing unit 12 of the movement support system main body 40 via the signal line 8.
  • The movement support system main body 40 differs from the movement support system main body 10 of FIG. 1 in that a robot drive unit 41 is employed instead of the display control unit 18.
  • the robot drive unit 41 is provided with information on the position of the captured target image on the captured image from the target image detection unit 13 and information on the estimated position of the imaging target from the target position estimation unit 16.
  • the robot drive unit 41 generates a drive control signal for driving the arms 52L and 52R of the insertion robot 51 and outputs the drive control signal to the drive control unit 53 so that the imaging target image is included in the imaging range.
  • The robot drive unit 41 performs drive control of the arms 52L and 52R so that the imaging target image is captured, and when the imaging target image disappears from the captured image, it drives the arms 52L and 52R, based on the position estimation result of the imaging target from the target position estimation unit 16, so that the imaging target image is captured again.
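  • As a rough, hypothetical sketch of how the robot drive unit 41 could translate a detected or estimated target position into drive commands (the publication does not give a control law), a simple proportional re-centering step is shown below; the command keys and gain value are assumptions.

```python
def steering_command(target_xy, frame_shape, gain=0.002):
    """Convert a target position in image coordinates into bending commands
    that steer the imaging direction so the target re-enters the view centre."""
    h, w = frame_shape[:2]
    err_x = target_xy[0] - w / 2
    err_y = target_xy[1] - h / 2
    return {"bend_left_right": gain * err_x, "bend_up_down": gain * err_y}

def control_step(detected_xy, estimated_xy, frame_shape):
    """If the lumen is detected, steer toward the detected position; if it has disappeared,
    steer toward the position estimated by the target position estimation unit."""
    target = detected_xy if detected_xy is not None else estimated_xy
    if target is None:
        return {"bend_left_right": 0.0, "bend_up_down": 0.0}   # no guidance available
    return steering_command(target, frame_shape)
```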
  • In this way, the insertion portion 3 is smoothly inserted into the intestinal tract or the like.
  • That is, not only is the insertion portion controlled in accordance with the imaging target image in the captured image, but even when the imaging target image disappears from the captured image, the position of the imaging target is estimated with the captured image as a reference and insertion of the insertion portion is controlled based on the estimation result, enabling effective insertion support that reliably guides the insertion portion toward the depth of the lumen or the like.
  • The control unit 11, image processing unit 12, target image detection unit 13, target position change estimation unit 14, target position estimation units 16 and 19, display control unit 18, and the like in the above embodiments may be configured by a dedicated circuit or by combining a plurality of general-purpose circuits, and, as necessary, may be configured by combining a processor such as a microprocessor or CPU that operates in accordance with pre-programmed software, or a sequencer.
  • a design in which an external device takes over part or all of the control is possible, and in this case, a wired or wireless communication circuit is interposed.
  • An embodiment in which an external device such as a server or a personal computer performs the characteristic processing and supplementary processing of each embodiment is also assumed.
  • the present application covers a case where a plurality of devices cooperate to establish the characteristics of the present invention.
  • For such communication, Bluetooth (registered trademark), Wi-Fi (registered trademark), a telephone line, or the like is used.
  • The communication at this time may also be performed via USB or the like.
  • a dedicated circuit, a general-purpose circuit, or a control unit may be integrated into an ASIC.
  • The controls and functions mainly described in the flowcharts can be implemented by a program, and the above-described controls and functions can be realized by a computer reading and executing the program.
  • The whole or part of the program may be recorded or stored on a portable medium such as a flexible disk, a CD-ROM, or a non-volatile memory, or on a storage medium such as a hard disk or a volatile memory, and can be distributed or provided at the time of product shipment or via a portable medium or a communication line.
  • The user can easily realize the movement support system of the present embodiment by downloading the program via a communication network and installing it on a computer, or by installing it on a computer from a recording medium.
  • the present invention is not limited to the above-described embodiments as they are, and can be embodied by modifying constituent elements without departing from the scope of the invention in the implementation stage.
  • Various inventions can be formed by appropriately combining the plurality of constituent elements disclosed in the above embodiments. For example, some of the constituent elements shown in an embodiment may be deleted.
  • Constituent elements from different embodiments may also be combined as appropriate.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a movement support system comprising: a target image detection unit that receives a captured image acquired by an imaging unit fixed to a moving body and detects an image of an imaging target included in the captured image and the position of the detected image of the imaging target in the captured image; a position storage unit for storing the position, in the captured image, of the image of the imaging target detected by the target image detection unit; a target position change estimation unit that receives captured images and detects changes in the captured images in order to estimate a change in the position of the imaging target with respect to the captured image; and a target position estimation unit for estimating the position of the imaging target with respect to the captured image and outputting a position estimation result when it is indicated that a disappearance state has occurred, that is, a state in which the target image detection unit no longer detects the image of the imaging target in the captured image, said estimation of the position of the imaging target being performed on the basis of the position information stored in the position storage unit and of the result of the estimation of the change in position of the imaging target obtained by the target position change estimation unit.
PCT/JP2018/017097 2018-04-26 2018-04-26 Système d'aide au déplacement et procédé d'aide au déplacement Ceased WO2019207740A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2018/017097 WO2019207740A1 (fr) 2018-04-26 2018-04-26 Système d'aide au déplacement et procédé d'aide au déplacement
JP2020515412A JP7093833B2 (ja) 2018-04-26 2018-04-26 移動支援システム及び移動支援方法
US17/079,610 US11812925B2 (en) 2018-04-26 2020-10-26 Movement assistance system and movement assistance method for controlling output of position estimation result

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/017097 WO2019207740A1 (fr) 2018-04-26 2018-04-26 Système d'aide au déplacement et procédé d'aide au déplacement

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/079,610 Continuation US11812925B2 (en) 2018-04-26 2020-10-26 Movement assistance system and movement assistance method for controlling output of position estimation result

Publications (1)

Publication Number Publication Date
WO2019207740A1 true WO2019207740A1 (fr) 2019-10-31

Family

ID=68293978

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/017097 Ceased WO2019207740A1 (fr) 2018-04-26 2018-04-26 Système d'aide au déplacement et procédé d'aide au déplacement

Country Status (3)

Country Link
US (1) US11812925B2 (fr)
JP (1) JP7093833B2 (fr)
WO (1) WO2019207740A1 (fr)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5063023B2 (ja) * 2006-03-31 2012-10-31 キヤノン株式会社 位置姿勢補正装置、位置姿勢補正方法
JP5094036B2 (ja) * 2006-04-17 2012-12-12 オリンパスメディカルシステムズ株式会社 内視鏡挿入方向検出装置
US9718190B2 (en) * 2006-06-29 2017-08-01 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
JP4961475B2 (ja) 2007-06-20 2012-06-27 オリンパスメディカルシステムズ株式会社 内視鏡システム
US9295372B2 (en) * 2013-09-18 2016-03-29 Cerner Innovation, Inc. Marking and tracking an area of interest during endoscopy
JP6257371B2 (ja) * 2014-02-21 2018-01-10 オリンパス株式会社 内視鏡システム及び内視鏡システムの作動方法
US10932657B2 (en) * 2014-04-02 2021-03-02 Transenterix Europe S.A.R.L. Endoscope with wide angle lens and adjustable view
DE102017103198A1 (de) * 2017-02-16 2018-08-16 avateramedical GmBH Vorrichtung zum Festlegen und Wiederauffinden eines Bezugspunkts während eines chirurgischen Eingriffs
JP6833978B2 (ja) * 2017-03-30 2021-02-24 富士フイルム株式会社 内視鏡システム、プロセッサ装置、及び、内視鏡システムの作動方法
US11553829B2 (en) * 2017-05-25 2023-01-17 Nec Corporation Information processing apparatus, control method and program
US10346978B2 (en) * 2017-08-04 2019-07-09 Capsovision Inc. Method and apparatus for area or volume of object of interest from gastrointestinal images

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003093328A (ja) * 2001-09-25 2003-04-02 Olympus Optical Co Ltd 内視鏡挿入方向検出方法及び内視鏡挿入方向検出装置
JP2011224038A (ja) * 2010-04-15 2011-11-10 Olympus Corp 画像処理装置及びプログラム
JP2012170641A (ja) * 2011-02-22 2012-09-10 Olympus Corp 蛍光観察装置
WO2017006449A1 (fr) * 2015-07-08 2017-01-12 オリンパス株式会社 Appareil d'endoscopie

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024018713A1 (fr) * 2022-07-19 2024-01-25 富士フイルム株式会社 Dispositif de traitement d'image, dispositif d'affichage, dispositif d'endoscope, procédé de traitement d'image, programme de traitement d'image, modèle entraîné, procédé de génération de modèle entraîné et programme de génération de modèle entraîné
WO2024029502A1 (fr) * 2022-08-01 2024-02-08 日本電気株式会社 Dispositif d'aide à l'examen endoscopique, procédé d'aide à l'examen endoscopique et support d'enregistrement
WO2024028934A1 (fr) * 2022-08-01 2024-02-08 日本電気株式会社 Dispositif d'assistance d'endoscopie, procédé d'assistance d'endoscopie et support d'enregistrement
JPWO2024029502A1 (fr) * 2022-08-01 2024-02-08
JP7768398B2 (ja) 2022-08-01 2025-11-12 日本電気株式会社 内視鏡検査支援装置、内視鏡検査支援方法、及び、プログラム

Also Published As

Publication number Publication date
US11812925B2 (en) 2023-11-14
JPWO2019207740A1 (ja) 2021-02-12
US20210052136A1 (en) 2021-02-25
JP7093833B2 (ja) 2022-06-30

Similar Documents

Publication Publication Date Title
US10932875B2 (en) Manipulator, medical system, and medical system control method
US7641609B2 (en) Endoscope device and navigation method for endoscope device
JP4733243B2 (ja) 生検支援システム
JP7385731B2 (ja) 内視鏡システム、画像処理装置の作動方法及び内視鏡
EP1543765A1 (fr) Systeme de traitement medical, systeme d'endoscope, programme d'operation d'insertion d'endoscope, et dispositif d'endoscope
JPWO2008155828A1 (ja) 内視鏡システム、撮像システム及び画像処理装置
JP6600690B2 (ja) 挿入体支援システム
JP7007478B2 (ja) 内視鏡システムおよび内視鏡システムの作動方法
WO2012068194A2 (fr) Guidage d'endoscope basé sur la correspondance d'images
US20230148847A1 (en) Information processing system, medical system and cannulation method
JP4436638B2 (ja) 内視鏡装置及び内視鏡挿入動作プログラム
US12299922B2 (en) Luminal structure calculation apparatus, creation method for luminal structure information, and non-transitory recording medium recording luminal structure information creation program
US20240016366A1 (en) Image diagnosis system for lesion
US11812925B2 (en) Movement assistance system and movement assistance method for controlling output of position estimation result
JP6203456B2 (ja) 超音波観測装置、超音波観測装置の作動方法および超音波観測装置の作動プログラム
JPWO2022202401A5 (fr)
US20240062471A1 (en) Image processing apparatus, endoscope apparatus, and image processing method
JP7441934B2 (ja) 処理装置、内視鏡システム及び処理装置の作動方法
US20240000299A1 (en) Image processing apparatus, image processing method, and program
JP4323515B2 (ja) 内視鏡システム
US20240112407A1 (en) System, methods, and storage mediums for reliable ureteroscopes and/or for imaging
JP7544945B1 (ja) 内視鏡システム及び内視鏡システムの作動方法
JP4262741B2 (ja) 内視鏡形状検出装置
KR102875761B1 (ko) 상부 위장관 이미지 획득을 위한 내시경 장치 및 제어 방법
WO2024028934A1 (fr) Dispositif d'assistance d'endoscopie, procédé d'assistance d'endoscopie et support d'enregistrement

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 18916776

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020515412

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18916776

Country of ref document: EP

Kind code of ref document: A1