WO2023218523A1 - Second endoscope system, first endoscope system, and endoscopic examination method - Google Patents
- Publication number
- WO2023218523A1 (PCT/JP2022/019794)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- endoscope system
- image
- operation unit
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
Definitions
- The present invention provides a second endoscope system capable of acquiring time-series operation content information from a first endoscope system and performing operations for examination based on this operation content information, a first endoscope system therefor, and an endoscopic examination method.
- Patent Document 1 discloses a dental treatment support device for use in dental treatment, comprising: an imaging unit that photographs the inside of the oral cavity of a patient to be treated; a storage unit that stores a plurality of pieces of product information and processing procedure information for each application of each product; a photographed image analysis section that detects the treatment target in the photographed image and identifies the treatment step; a processing procedure control section that selects the processing procedure corresponding to the treatment step; and a display control section that displays the processing procedure and the photographed image side by side.
- Patent Document 1 describes displaying the processing procedure during treatment, so that the dentist or the like can perform the treatment according to the displayed procedure and the target area for treatment can be easily found.
- The present invention has been made in view of the above circumstances, and its purpose is to provide a second endoscope system, a first endoscope system, and an endoscopic examination method that allow easy access to target areas such as affected areas.
- A second endoscope system according to a first invention is a second endoscope system for observing a target organ of a subject who has undergone an organ examination using the first endoscope system, comprising: an acquisition unit that acquires time-series operation content information in the first endoscope system as operation unit information; an insertion operation determination unit that estimates the operation process when the subject undergoes an examination using the second endoscope system; and an operation guide unit that compares the operation process estimated by the insertion operation determination unit with the operation unit information and outputs operation guide information for operating the second endoscope system so as to observe a characteristic site in the target organ with the second endoscope system.
- the operation unit information is image change information estimated using the asymmetry of the organ to be observed.
- A second endoscope system is characterized in that, in the first invention, the asymmetry information of the organ to be observed is determined based on the anatomical positional relationship of a plurality of parts within the specific organ.
- the operation unit information is information regarding an operation that continues for a predetermined period of time.
- A second endoscope system is characterized in that, in the third invention, the operation unit information is information regarding an operation start image and operations from the start to the end of the operation.
- A second endoscope system is characterized in that, in the first invention, the operation guide information output by the operation guide section is guide information for observing the characteristic part of the observation target organ under observation conditions similar to those of the first endoscope system.
- the operation unit information is image change information indicating a series of the same operations.
- A second endoscope system, in the first invention, determines a first direction when detecting asymmetry of the organ to be observed.
- A second endoscope system according to an eighth invention, in the first invention, detects the asymmetry of the organ to be observed with reference to the direction in which liquid accumulates, which is determined by the direction of gravity, or the direction determined by the positional relationship of already detected structures within the body.
- A second endoscope system according to a ninth invention is characterized in that, in the first invention, the operation unit information is determined by reflecting the angle at which a lever or knob for rotating the distal end portion of the endoscope system is turned.
- A second endoscope system according to a tenth invention is characterized in that, in the first invention, the operation unit information is information whose operation unit is the process until the observation direction of the distal end of the endoscope system changes.
- A second endoscope system according to an eleventh invention is characterized in that, in the tenth invention, the observation direction of the distal end of the endoscope system changes by twisting the endoscope system, by angulating the endoscope system, or by pushing the endoscope system into the body.
- the operation unit information is information in which the unit of operation is a process until the shape of the organ to be observed changes.
- A second endoscope system according to a thirteenth invention is characterized in that, in the twelfth invention, the operation unit information is information whose operation unit is the process during which the shape of the organ is estimated to change by air supply, water supply, and/or suction using the endoscope system, or by pushing the endoscope system into the organ.
- A second endoscope system according to a fourteenth invention is characterized in that, in the twelfth invention, the operation unit information is information whose operation unit is the process until the state of the mucous membrane of an organ is estimated to change by dispersing a pigment and/or a stain using the first endoscope system, or by performing water supply using the first endoscope system.
- In a second endoscope system, in the first invention, the operation guide information for operating the second endoscope system to observe a characteristic site in the organ to be observed is determined by comparing a plurality of pieces of operation unit information; if overlapping parts do not require follow-up observation, the corresponding information is corrected and compared with the operation unit information excluding the operations of the duplicated parts.
- A second endoscope system according to a sixteenth invention, in the first invention, observes a characteristic part of the observation target organ by automatic operation under the same conditions as the first endoscope system, based on the operation unit information.
- The endoscopic examination method is a method for examining, using a second endoscope system, a subject whose organ has been examined using a first endoscope system: time-series operation content information in the first endoscope system is acquired as operation unit information; the operation process when the subject undergoes an examination using the second endoscope system is estimated; the estimated operation process is compared with the operation unit information; and operation guide information for operating the second endoscope system so as to observe characteristic parts of the organ to be observed with the second endoscope system is output. The operation unit information is image change information estimated using the asymmetry of the observation target organ.
- The first endoscope system includes: an input unit for inputting images of organs of a subject in chronological order; an operation unit determination unit that divides the images obtained in chronological order into operation units and determines the operation performed for each unit; a recording unit that records, for each operation unit determined by the operation unit determination unit, the image and information regarding the endoscope operation in that operation unit as operation unit information; and an output section that outputs the operation unit information recorded in the recording unit.
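The recording flow described above — chronological images divided into operation units, with the operation determined per unit and the result recorded — can be sketched roughly as follows. This is a minimal illustration, not the patented implementation; the `Frame` and `OperationUnit` types and the per-frame motion labels are assumptions introduced for the example.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    time: float
    motion: str  # motion estimated for this frame: "straight", "rotate", "bend"

@dataclass
class OperationUnit:
    motion: str
    start_time: float
    end_time: float

def divide_into_operation_units(frames):
    """Group consecutive frames sharing the same estimated motion into units."""
    units = []
    for f in frames:
        if units and units[-1].motion == f.motion:
            units[-1].end_time = f.time   # extend the current operation unit
        else:
            units.append(OperationUnit(f.motion, f.time, f.time))
    return units

frames = [Frame(0.0, "straight"), Frame(0.5, "straight"),
          Frame(1.0, "rotate"), Frame(1.5, "straight")]
units = divide_into_operation_units(frames)
# three units: straight, rotate, straight
```

Each resulting unit would then be stored with its images as one piece of operation unit information.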
- In the first endoscope system according to a nineteenth invention, in the eighteenth invention, the operation unit determination unit divides the operation into the above operation units based on whether at least one of the insertion direction, rotation direction, and bending direction of the distal end of the first endoscope has changed, based on the image acquired by the imaging unit. In the first endoscope system according to a twentieth invention, in the eighteenth invention, the operation unit determination unit determines the direction of the operation based on the asymmetry of the anatomical structure in the image acquired by the imaging unit.
- The recording unit records a start image and an end image among the continuous images belonging to the operation unit, and records operation information indicating the operation status in between.
- the recording section records operation information after a target object serving as a landmark near the target is discovered.
- The endoscopy method acquires images of organs of a subject in chronological order, divides the images acquired in chronological order into operation units, determines the operation performed by the first endoscope for each operation unit, records, for each determined operation unit, the image and information regarding the endoscope operation in that unit in a recording unit as operation unit information, and outputs the operation unit information recorded in the recording unit.
- The second endoscope system according to the twenty-fourth invention is a second endoscope system for observing the organs of a subject who has undergone an organ examination using the first endoscope system, and includes: an input section for inputting recorded operation unit information for the subject examined using the first endoscope system; an input section for inputting images of the subject's organs in chronological order; and an operation guide unit that divides the images acquired in chronological order into operation units, estimates the operation state of the second endoscope system for each operation unit, compares the estimated operation state with the operation unit information, and outputs guide information for observation under the same observation conditions as the first endoscope system.
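The comparison step performed by the operation guide unit can be sketched as a simple alignment between the recorded unit sequence and the units performed so far, emitting the next recorded operation as the guide. All names here are hypothetical; the patent does not specify a matching algorithm.

```python
def next_guide(recorded_units, performed_units):
    """recorded_units / performed_units: ordered lists of motion labels.
    Returns the next recorded operation the operator should perform,
    or None once the recorded route has been reproduced."""
    i = 0
    for done in performed_units:
        # advance through the recorded route whenever the operator's
        # estimated operation matches the next recorded unit
        if i < len(recorded_units) and done == recorded_units[i]:
            i += 1
    return recorded_units[i] if i < len(recorded_units) else None

recorded = ["straight", "rotate", "straight", "bend"]
assert next_guide(recorded, ["straight", "rotate"]) == "straight"
assert next_guide(recorded, recorded) is None
```

In practice the comparison would use image features rather than bare labels, but the control flow — estimate, compare, output the next guide step — is the same.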
- The program according to the twenty-fifth invention causes a computer, which observes an organ of a subject whose organ has been examined using a first endoscope system by means of a second endoscope system, to: acquire time-series operation content information in the first endoscope system as operation unit information; estimate the operation process when the subject undergoes an examination using the second endoscope system; compare the estimated operation process with the operation unit information; and output guide information so that characteristic parts of the observation target organ can be observed under the same observation conditions as the first endoscope system.
- The program according to the twenty-sixth invention causes a computer to: acquire images of organs of a subject in chronological order; divide the images acquired in chronological order into operation units; determine, for each operation unit, the operation performed by the first endoscope; record, for each determined operation unit, the image and information regarding the endoscope operation in that unit in a recording unit as operation unit information; and output the operation unit information recorded in the recording unit.
- According to the present invention, it is possible to provide a second endoscope system, a first endoscope system, and an endoscopy method that allow easy access to a target site such as an affected area.
- FIG. 1A is a block diagram mainly showing the electrical configuration of an endoscope system according to an embodiment of the present invention.
- FIG. 1B is a block diagram mainly showing the electrical configuration of an endoscope system according to an embodiment of the present invention.
- FIG. 2 is a diagram illustrating an example of a route taken to reach an object serving as a landmark, such as an affected area, in an endoscope system according to an embodiment of the present invention.
- FIG. 3 is a flowchart showing the operation of the first endoscope system in the endoscope system according to one embodiment of the present invention.
- FIG. 4 is a flowchart showing the operation of the second endoscope system in the endoscope system according to one embodiment of the present invention.
- FIG. 5 is a diagram showing a process of inserting an endoscope in an endoscope system according to an embodiment of the present invention.
- FIG. 6 is a diagram showing a process of inserting an endoscope in an endoscope system according to an embodiment of the present invention.
- FIG. 7 is a diagram showing an example of captured images when an endoscope is inserted in an endoscope system according to an embodiment of the present invention.
- FIG. 8 is a diagram showing an example of captured images when an endoscope is inserted in an endoscope system according to an embodiment of the present invention.
- This endoscope system records and communicates information that helps reproduce examinations, diagnoses, and treatments performed by endoscopists (in this specification, examinations, diagnoses, and treatments may be collectively referred to simply as examinations). Furthermore, in order to reproduce the examination performed by an endoscopist, images captured during the examination are recorded, and image changes and image characteristics are utilized.
- The operating state of the endoscopist is determined based on image changes as follows: (1) if the change pattern of the inspection image is constant (for example, like an image taken while driving through a tunnel), it is determined that the endoscope is moving straight; (2) if the change pattern indicates rotation or twisting, it is determined that the endoscope has been rotated or twisted.
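Criteria (1) and (2) above can be sketched as a minimal rule-based classifier. It assumes the dark lumen (the "tunnel") has already been measured in each frame as an `(area, angle_deg)` pair; the function name and threshold are illustrative assumptions, not details from the patent.

```python
def classify_motion(prev, curr, angle_tol_deg=5.0):
    """prev/curr: (area, angle_deg) of the lumen region in two frames."""
    if abs(curr[1] - prev[1]) > angle_tol_deg:
        return "rotate"      # (2) change pattern shows rotation/twisting
    if curr[0] > prev[0]:
        return "straight"    # (1) lumen grows steadily: advancing in the "tunnel"
    return "unknown"

assert classify_motion((100.0, 10.0), (140.0, 11.0)) == "straight"
assert classify_motion((100.0, 10.0), (100.0, 40.0)) == "rotate"
```

A real system would derive these measurements from image processing (e.g. region segmentation or optical flow), but the decision rule follows the two cases stated above.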
- Up, down, left, and right are defined using the structure of the organ to be observed with the endoscope (for example, when viewed from the endoscope, the vocal cord side of the throat is lower; in the stomach, the gastric angle side is upper).
- The position (direction) in an endoscopic image is displayed using the anatomical normal position (the anatomical normal position will be described later using FIG. 5). Using these clues, even non-specialists can reproduce ideal examinations.
- Tg is a target site such as an affected area discovered by a specialist, and a non-specialist operates the endoscope so as to reach this target site Tg.
- Ob is an object that serves as a landmark on the route to reach the target site Tg. When this landmark Ob is found, the target region Tg is searched for using this landmark Ob as a clue. Although only one target region is depicted in FIG. 2, there may be a plurality of target regions.
- A specialist operates the first endoscope system, advances straight along route R1, and at position L1 bends or rotates the endoscope to change the direction of travel. Thereafter, the endoscope is further advanced along route R2, and at position L2 the endoscope is bent or rotated to change the direction of travel to route R3. Proceeding in this state, the image of the landmark Ob is captured at position L3 and the route is changed to route R4; at position L4 the route is changed to route R5, and the target area Tg, such as an affected area, is finally discovered.
- one operation unit is from the start of operation to position L1 along route R1
- one operation unit is from position L1 to position L2 along route R2
- one operation unit is from position L2 to position L3 along route R3
- One operation unit is from position L3 to position L4 along route R4, and one operation unit is from position L4 to the target region Tg along route R5.
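The route of FIG. 2 can thus be encoded as a chain of recorded operation units, each running from one direction change (or landmark event) to the next. The field names below are illustrative assumptions, not the patent's data format.

```python
# Hypothetical encoding of the FIG. 2 route as operation units.
route_units = [
    {"route": "R1", "start": "insertion", "end": "L1"},
    {"route": "R2", "start": "L1", "end": "L2"},
    {"route": "R3", "start": "L2", "end": "L3"},  # landmark Ob imaged at L3
    {"route": "R4", "start": "L3", "end": "L4"},
    {"route": "R5", "start": "L4", "end": "Tg"},  # target region reached
]

# Consecutive units chain together: each unit starts where the last ended.
assert all(route_units[i]["end"] == route_units[i + 1]["start"]
           for i in range(len(route_units) - 1))
```

Replaying these units in order is what lets a non-specialist retrace the specialist's path to Tg.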
- The explanation here is given in terms of vertical and horizontal movement in two dimensions. However, since the endoscope actually moves within a three-dimensional structure, operations involving rotation of the screen and operations that change the viewing position vertically and horizontally can also be assumed.
- an operating guide (operating advice) can be generated based on this information.
- a non-specialist can easily reach the target site Tg by operating the second endoscope system while receiving an operating guide.
- When the second endoscope system is used to perform an examination on the same patient (subject) on whom a specialist has performed an examination, the operator can easily reach the indicated target region Tg by receiving the operation guide.
- At position L1, operation guide information such as rotation and bending is displayed; the same applies thereafter at position L2, at position L3 (the position of the landmark Ob), and at position L4. Reference guide information is also displayed at locations other than these.
- The anatomical normal position is used to display the direction in the image.
- For example, when observing the lesser curvature side after insertion into the stomach, there are cases where the specialist first looks at the greater curvature side from the cardia and then goes to the lesser curvature. If the final follow-up observation is on the lesser curvature side, the part seen on the greater curvature may be omitted, and only the operation of entering from the cardia and turning toward the lesser curvature may be recorded as a unit of operation (that is, the operation of swinging from the cardia insertion to the greater curvature and returning to the vicinity of the cardia insertion is omitted).
- For operation unit information that remains as a history, if a specific guide starting point can be reset, such as "go forward and then go back" or "look to the right and then look to the left", a plurality of pieces of operation unit information included in the history may be corrected to generate guide operation unit information in which the guide start point is reset, and this operation unit information may be referred to for guidance.
- the operation unit information may be corrected and compared.
- The operation unit information for the guide may be strictly corrected to create new data corresponding to the operation unit information for comparison, or the guide may be issued predictively without going to the trouble of creating new data.
- A method for displaying directions within an image based on the anatomical normal position will be explained using FIG. 5.
- The tip of the endoscope is cylindrical, with the imaging unit placed inside it, and in images of the inside of the digestive tract, which has a complicated shape, it is hard to tell which direction is up, down, left, or right.
- In general landscape or portrait photographs, it is possible to understand from the image which side of the screen is up or down, and what is in front or behind, whereas it is difficult to determine the direction from an image of the interior of the digestive tract, and some definition is needed to represent the direction.
- Anatomical position is used to represent the direction.
- The anatomical normal position is a posture in which one stands straight with the palms facing forward (the direction the face is facing), and directions are expressed based on this position. This convention is especially useful when expressing parts that easily change direction, such as the limbs. However, even if the anatomical normal position is assumed, representations of the directions of the limbs, brain, etc. tend to cause confusion, so easy-to-understand representations as described below are preferred.
- The direction of the head is superior, and the direction of the feet is inferior.
- left and right are expressed as left and right as seen from the person being observed. That is, when a doctor is facing a patient, the left half of the patient's body is on the right side as viewed from the doctor. If a doctor is observing a patient's back, the right side of the patient's body is on the right side as seen from the doctor.
- the side facing the face is the front (anterior)
- the side facing the back is the back (posterior).
- Figure 5 shows the anatomical directions according to the normal position. Note that when gastric endoscopy is performed, the whole body is actually turned sideways (left lateral position), but in Figure 5, for convenience of drawing, the head is drawn sideways (facing left) and the body below the neck is drawn facing forward.
- the insertion route Ro of the endoscope is shown by a broken line. The distal end of the endoscope is inserted from the oral cavity OC (depending on the model of the endoscope, it may be inserted from the nasal cavity NC), passes through the esophagus ES, and advances to the stomach St.
- Image P5A in FIG. 5 is an image before entering the esophagus ES when the distal end of the endoscope is inserted from the oral cavity OC.
- The vocal cords VC and the trachea Tr are on the lower side in the anatomical normal position, with the trachea Tr on the front side; the esophagus ES is on the upper side of the screen and on the back side in the anatomical normal position.
- In FIG. 5, when the distal end of the endoscope advances in the direction of the duodenum, it advances toward the pylorus Py.
- the distal end of the endoscope may be advanced along the wall surface of the stomach St, and the endoscope may be turned in a direction in which the pylorus Py can be seen.
- Image P5B shown in FIG. 5 is an image of the pylorus Py viewed from the side. When this image P5B is visible, it is sufficient to perform a bending operation on the tip of the endoscope to change the direction of the tip.
- a specialist operates the first endoscope system 10A and records information until reaching the target site Tg such as an affected area, and a non-specialist performs an examination on the same subject.
- operation guide information is displayed based on the recorded information.
- FIG. 6 shows how the endoscope EDS is inserted from the oral cavity OC of the subject, passes through the stomach St, and inspects the pylorus Py. Note that in FIG. 6(a), similarly to FIG. 5, for convenience of drawing, the head is drawn sideways (facing left), and the body below the neck is drawn facing forward.
- FIG. 6(a) shows the endoscope EDS being inserted into the oral cavity OC
- FIG. 6(b) shows the endoscope EDS being inserted into the esophagus ES and stomach St.
- FIG. 7 shows images P6a to P6f acquired when the endoscope EDS is inserted into the digestive tract, and these images change from moment to moment.
- Images P6a to P6c show the endoscope EDS being inserted into the esophagus ES at times T1 to T3; at this time, the tip of the endoscope EDS is neither rotated nor bent and is going straight. For this reason, the oval (hole) shape of the esophagus ES gradually becomes larger.
- Images P6a to P6c are images in the first operation unit.
- From time T3 to time T4, the endoscope EDS is rotated, and in the images acquired at this time, the elliptical (hole-shaped) portion rotates.
- Images P6c to P6d are images in the second operation unit.
- The reason the distal end of the endoscope EDS rotates from time T3 to time T4 is to search for the pylorus Py in the stomach St. That is, when the distal end of the endoscope EDS moves downward a predetermined distance along the wall surface of the stomach St, it reaches the vicinity of the pylorus Py, so at this timing the distal end of the endoscope EDS is rotated to find the pylorus Py. If the pylorus Py is found, the endoscope can be advanced to the duodenum.
- images P6e to P6f are images of the third operation unit.
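The segmentation of the image series P6a to P6f into three operation units can be sketched as boundary detection on per-frame lumen measurements: while the elliptical hole only grows, frames belong to one straight-insertion unit; when its orientation starts changing, a rotation unit begins. The measurement representation and threshold below are assumptions for illustration.

```python
def segment(measurements, angle_tol=5.0):
    """measurements: list of (area, angle_deg) per frame.
    Returns the indices where a new operation unit starts."""
    boundaries = [0]
    mode = None
    for i in range(1, len(measurements)):
        d_angle = abs(measurements[i][1] - measurements[i - 1][1])
        new_mode = "rotate" if d_angle > angle_tol else "straight"
        if mode is not None and new_mode != mode:
            boundaries.append(i)   # the motion type changed: new unit
        mode = new_mode
    return boundaries

# growing lumen (straight), then angle changes (rotation), then steady again
series = [(50, 0), (80, 1), (120, 2), (125, 30), (130, 60), (135, 61)]
assert segment(series) == [0, 3, 5]   # three operation units
```

This mirrors how the asymmetric appearance of the organ lets a continuous image stream be cut into discrete operation units.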
- the operation unit information is image change information indicating a succession of the same motions estimated using the asymmetry of the observed organ.
- FIG. 7 shows the case where each of the straight operation, rotation operation, and bending operation is performed independently.
- In practice, multiple operations may be performed in combination; even in this case, by taking advantage of the asymmetry of internal organs, they can be broken down into individual operations and the operation information obtained.
- In the future, endoscopes may be developed that can perform bending operations at places other than the tip, bend in directions other than up, down, left, and right, or have functions similar to zoom lenses for controlling the view toward and away from the tip.
- this embodiment can be applied in the same way in that case as well.
- the operation unit information does not need to be limited to operations related to changes in the observation position of the endoscope tip.
- The following operations are frequently performed when observing a target region using an endoscope system and should also be treated as units of operation. For example, by dispersing a pigment or stain, the shape of irregularities in the target region and the difference between a lesion and a normal region may become clearly visible.
- Visibility may also be improved by supplying water (for cleaning mucus, etc.) using an endoscope system.
- the estimated state of the mucous membrane of an organ may change due to active actions other than changing the position, and it is also important to use the process until the state of the mucous membrane of the organ changes as a unit of operation.
- the operation unit information is image information indicating a series of the same operations.
- The "same operation" is, simply put, inserting this amount, twisting and rotating this much, or turning the knob this much to bend the tip (in FIG. 7, "insertion direction, rotation, bending of the tip"), etc.
- the "same operations" are classified in too short a period of time, the operation instructions may become too fragmented and difficult to understand.
- the guide becomes uneasy during the operation.
- It is preferable that the same operation be divided into periods of time (for example, from several seconds to several tens of seconds) that can be easily performed by the operator of the second endoscope system while referring to the guide.
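The preference above — no guide step shorter than what an operator can comfortably follow — can be sketched as a post-processing pass that absorbs too-short units into the following one. The minimum duration and data shape are illustrative assumptions.

```python
def merge_short_units(units, min_duration=3.0):
    """units: list of (label, duration_seconds).
    Units shorter than min_duration are absorbed into the next unit,
    so the guide never issues an instruction too brief to follow."""
    merged = []
    carry = 0.0
    for label, dur in units:
        if dur + carry < min_duration:
            carry += dur              # too short to guide on its own
        else:
            merged.append((label, dur + carry))
            carry = 0.0
    if carry and merged:              # trailing remainder joins the last unit
        merged[-1] = (merged[-1][0], merged[-1][1] + carry)
    return merged

units = [("twist", 0.5), ("insert", 8.0), ("bend", 1.0), ("insert", 6.0)]
assert merge_short_units(units) == [("insert", 8.5), ("insert", 7.0)]
```

The same idea supports the time-sharing guide mentioned below: a combined insert-and-twist motion can be re-cut into units long enough to present one component at a time.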
- Experienced experts can twist the device while inserting it; to convey such a combined operation in an easy-to-understand manner, the guide may be divided into two components, insertion and twisting, presented in a time-sharing manner.
- FIG. 6 shows the route for inserting the endoscope EDS from the oral cavity OC toward the pylorus Py.
- FIG. 8 shows an example of an image acquired by the endoscope EDS during this insertion.
- Image P11 is an image when the vocal cords VC are viewed from above
- image P12 is an image when the esophagus ES is viewed from above.
- Image P13 is an image when entering the stomach St
- image P14 is an image when the pylorus Py is viewed from above
- image P15 is an image when the pylorus Py is viewed from the side.
- Internal organs are not symmetrical but asymmetrical; by utilizing this asymmetry, the transition of operations during continuous motion can be estimated, and the break between operations can be taken as the unit of operation.
- Operation unit information is recorded for each operation unit (for example, see operation unit information 35b in FIG. 1A, and S11 in FIG. 3).
- This operation unit information can be said to be image change information indicating a succession of the same motions estimated using the asymmetry of the observed organ (for example, see FIG. 7).
- the asymmetry information of the organ to be observed is determined based on the anatomical positional relationship of a plurality of parts within the specific organ. It may be adapted to match the anatomical orientation representation.
- the operation unit information is information regarding an operation that continues for a predetermined period of time.
- the operation unit information may be information regarding an operation start image and operations from the start to the end of the operation.
- The operation unit information may include an end image, and/or information serving as a landmark for discovering the target region, and/or information regarding the target region, and/or pre-discovery operation information (for example, see S41 and S48).
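The fields enumerated above suggest a record layout along the following lines. This is a hypothetical sketch of one piece of operation unit information; the class and field names are not from the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class OperationUnitInfo:
    start_image: str                     # identifier of the operation start image
    operations: list = field(default_factory=list)  # operations from start to end
    end_image: Optional[str] = None      # optional end image
    landmark_info: Optional[str] = None  # landmark for discovering the target
    target_info: Optional[str] = None    # information regarding the target region

u = OperationUnitInfo("P6c", ["rotate 30 deg"], end_image="P6d")
assert u.start_image == "P6c" and u.target_info is None
```

Each recorded unit of this kind is what the second endoscope system later compares against its estimated operation state.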
- The operation unit information may be determined by reflecting the angle at which a lever or knob for rotating the distal end of the endoscope system is turned. Further, the operation unit information may be information in which the operation unit is the process until the observation direction of the distal end of the endoscope system changes. The observation direction of the distal end of the endoscope system may be changed by twisting the endoscope system, by angulating the endoscope system, or by pushing the endoscope system into the body.
- the operation unit information may be information in which the operation unit is a process until the shape of the organ to be observed changes.
- The operation unit information may be information in which the operation unit is the process during which the shape of the organ is estimated to change by supplying air or water, or applying suction, using the endoscope system, or by pushing in the endoscope system.
- The operation unit information may be information in which the operation unit is the process until the state of the mucous membrane of the organ is estimated to change by spraying a pigment or staining agent using the first endoscope system, or by delivering water (for mucous-membrane cleaning) using the first endoscope system.
- the operation unit information is not limited to operations related to changes in the observation position of the endoscope tip; operations that improve visibility or detectability by changing the state of the target part, or of something blocking it, during observation with the endoscope may also be included in the operation unit information.
- the anatomical normal position is used to represent the direction; therefore, a first direction may be determined when detecting the asymmetry of the organ to be observed. Furthermore, in detecting the asymmetry of the organ to be observed, reference may be made to the direction in which liquid accumulates, which is determined by the direction of gravity, or to a direction determined by the positional relationship of structures already detected within the body.
- the present invention is not limited to this, and characteristic parts of the observation target organ may be observed by automatic operation under the same observation conditions as the first endoscope system based on the operation unit information.
- guide information is output so that the characteristic parts of the organ to be observed can be observed under the same observation conditions as the first endoscope system (for example, see S37 and S47 in FIG. 4).
- similar observation conditions include the size of the object photographed within the screen, the angle of view, and the like; these are conditions for making the positional relationship between the imaging unit and the observation object the same when observing the observation object.
- as shown in FIGS. 1A and 1B, this endoscope system includes a second endoscope system for observing the organs to be observed of a subject (including a patient) who has undergone an organ examination (including diagnosis and treatment) using the first endoscope system.
- the endoscope system according to this embodiment includes an endoscope system 10A, an auxiliary device 30 provided in a hospital system server, etc., and a second endoscope system 10B.
- the endoscope system 10A and the second endoscope system 10B are endoscopes that are inserted from the oral cavity through the esophagus to examine the stomach or duodenum.
- the endoscope system 10A is an endoscope used for the first examination of the subject
- the second endoscope system 10B is an endoscope used for the second and subsequent examinations of the subject.
- the endoscope system 10A and the second endoscope system 10B may be the same model of endoscope, but will be described here as different models.
- even when the second endoscope system 10B is the same model as the endoscope system 10A, it may be the same device or a different model/device.
- even with the same device, the results may not be exactly the same due to changes in surrounding conditions: changes in the patient's physical and health conditions such as changes in the affected area, physical and mental constraints such as changes in doctors, fatigue and habituation, or changes in the availability of assistants, peripheral equipment, the environment, and other surrounding circumstances.
- therefore, in this embodiment, it is desirable that information can be inherited when multiple examinations are performed at different timings (in many cases the examination dates differ, but a same-day re-examination is also conceivable).
- the endoscope system 10A is used by a doctor to observe the inside of the pharynx, esophagus, stomach, and duodenum, and perform tests, treatments, surgeries, etc.
- This endoscope system 10A includes a control section 11A, an imaging section 12A, a light source section 13A, a display section 14A, an ID management section 15A, a recording section 16A, an operation section 17A, an inference engine 18A, a clock section 20A, and a communication section 21A.
- each of the above-mentioned parts may be provided in an integrated device, but may also be distributed and arranged in a plurality of devices.
- the control unit 11A is composed of one or more processors having a processing device such as a CPU (Central Processing Unit), a memory storing a program (the program may be stored in the recording unit 16A), and the like; it executes the program and controls each part within the endoscope system 10A.
- the CPU of the control unit 11A executes the program in cooperation with the CPU of the control unit 31 of the auxiliary device 30, and realizes the flow operation shown in FIG.
- the control unit 11A performs various controls when the endoscope system 10A performs an endoscopic examination of a subject (patient), and also transmits image data P1 acquired during the examination to an in-hospital system, a server, etc. Control is performed to transmit data to the auxiliary device 30 located there.
- the imaging unit 12A is provided at the distal end of the endoscope system 10A that is inserted into the body, and includes an optical lens, an image sensor, an imaging circuit, an image processing circuit, and the like.
- the imaging unit 12A is assumed to be composed of a small-sized imaging device and an imaging optical system that forms an image of the object on the imaging device, and specifications such as the focus position and the focal length of the optical lens are determined. Further, the imaging unit 12A may be provided with an autofocus function or an expanded depth of field function (EDOF function), and in this case, it is possible to determine the distance to the object, the size of the object, and the like. If the imaging unit 12A has an angle of view of approximately 140 degrees to 170 degrees, it is possible to photograph over a wide range.
- EDOF function: expanded depth of field function
- the imaging optical system may include a zoom lens.
- the imaging unit 12A acquires image data of a moving image at predetermined time intervals determined by the frame rate, performs image processing on this image data, and then records it in the recording unit 16A. Furthermore, when the release button in the operating section 17A is operated, the imaging section 12A acquires still image data, and this still image data is recorded in the recording section 16A.
- the imaging unit 12A functions as an imaging unit that acquires images of the subject's organs in time series (for example, see S1 in FIG. 3).
- the image P1 is an image acquired by the imaging unit 12A, and is transmitted to the input unit 32 of the auxiliary device 30 through the communication unit 21A.
- Image P1 is a time series image
- image P11 is an image acquired immediately after inserting the tip of the endoscope system 10A into the oral cavity
- image P20 is an image acquired immediately before removing the endoscope system 10A from the oral cavity.
- Images P11 to P16 are consecutive images belonging to the operation unit.
- images P15 to P19 are also images belonging to another operation unit.
- the unit of operation is a series of images over which the image pattern changes due to a change in insertion direction, rotation of the tip, bending of the tip, etc., until the specialist reaches the target area such as the affected area.
- images P11 to P16 are the first unit of operation
- images P15 to P19 are the second unit of operation.
- images P15 and P16 overlap in the first and second operation units.
- images do not need to overlap between the two operation units, and images between the two operation units do not need to belong to either operation unit (in the latter case, no operation is performed during those images).
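The overlap just described could be sketched as follows (a minimal illustration: frame numbers stand in for the images, and the two unit ranges follow the example in the text; nothing here is taken from an actual implementation):

```python
def units_containing(frame, units):
    """Return the indices of all operation units whose frame range covers `frame`."""
    return [i for i, (start, end) in enumerate(units) if start <= frame <= end]

# First unit spans images P12..P16, second spans P15..P19, as in the text;
# frames P15/P16 therefore belong to both units, while the insertion image
# P11 and the withdrawal image P20 belong to neither.
operation_units = [(12, 16), (15, 19)]

assert units_containing(15, operation_units) == [0, 1]  # shared boundary frame
assert units_containing(13, operation_units) == [0]
assert units_containing(20, operation_units) == []      # withdrawal image
```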
- the light source section 13A includes a light source, a light source control section, and the like.
- the light source section 13A illuminates the object with appropriate brightness.
- a light source is placed at the distal end of the endoscope system 10A to illuminate the inside of the body, such as an affected area, and a light source control unit controls the illumination by the light source.
- the display unit 14A displays an image inside the body based on the image data acquired by the imaging unit 12A. Further, the display unit 14A can display an operation guide superimposed on the inspection image. For example, a display indicating the vicinity of the site (affected area) is made. This operation guide may be displayed based on the inference result by the inference engine 18A. Furthermore, a menu screen for operating and displaying the endoscope system 10A can also be displayed.
- the ID management unit 15A performs ID management for identifying a subject (patient) when a specialist performs an examination using the endoscope system 10A. For example, a specialist may input the ID of the subject (patient) through the operation unit 17A of the endoscope system 10A. Further, the ID management unit 15A may associate an ID with the image data acquired by the imaging unit 12A.
- the recording unit 16A has an electrically rewritable nonvolatile memory, and records adjustment values for operating the endoscope system 10A, programs used in the control unit 11A, and the like. It also records image data acquired by the imaging unit 12A.
- the operation unit 17A (also referred to as an interface) has various operation parts, such as an operation part for bending the distal end of the endoscope system 10A in an arbitrary direction, a light source operation part, an operation part for image capturing, and an operation part for treatment instruments.
- the ID of the subject (patient) may be input through the operation unit 17A.
- the inference model is placed within the inference engine 18A.
- This inference model may be composed of, for example, an inference model that infers possible diseased areas such as tumors or polyps in images acquired by the imaging unit 12A, or an inference model for an operation guide for operating the endoscope system 10A.
- the inference engine 18A may be configured by hardware, software (program), or a combination of hardware and software.
- the clock section 20A has a calendar function and a timekeeping function.
- when image data is acquired by the imaging unit 12A, the acquisition date and time may be output, or the elapsed time from the start of the examination may be output.
- time information and the like output from the clock section 20A may be recorded in association with the image data.
- the communication unit 21A has a communication circuit (including a transmission circuit and a reception circuit), and exchanges information with the auxiliary device 30. That is, the image data acquired by the imaging unit 12A is transmitted to the auxiliary device 30.
- the communication unit 21A may communicate information with the second endoscope system 10B in addition to the auxiliary device 30.
- the communication unit 21A may communicate with other servers and in-hospital systems, and in this case, it can collect information from and provide information from other servers and in-hospital systems. Alternatively, an inference model generated by an external learning device may be received.
- the auxiliary device 30 is installed in an in-hospital system, a server, or the like.
- the in-hospital system is connected to devices such as endoscopes, personal computers (PCs), mobile devices such as smartphones, etc. in one or more hospitals through wired or wireless communication.
- the server is connected to equipment such as endoscopes, in-hospital systems, etc. through a communication network such as the Internet or an intranet.
- the endoscope system 10A may be connected to an auxiliary device 30 in a hospital system, directly connected to an auxiliary device 30 in a server, or connected to an auxiliary device 30 through an in-hospital system.
- the auxiliary device 30 includes a control section 31, an input section 32, an ID management section 33, a communication section 34, a recording section 35, an inference engine 37 in which an inference model is set, and an operation unit determination section 36.
- each of the above-mentioned parts may be provided in an integrated device, but may also be distributed and arranged in a plurality of devices. Furthermore, each part may be connected through a communication network such as the Internet or an intranet.
- the control unit 31 is composed of one or more processors having a processing device such as a CPU (Central Processing Unit), a memory storing a program (the program may be recorded in the recording unit 35), and the like; it executes the program and controls each part within the auxiliary device 30.
- the control unit 31 performs overall control so that, when a specialist examines a subject (patient) using the endoscope system 10A and a non-specialist subsequently examines the same subject (patient) using the second endoscope system 10B, the auxiliary device 30 outputs an operation guide for finding the affected area of the subject (patient).
- the CPU of the control unit 31 of the auxiliary device 30 executes the program in cooperation with the CPU of the control unit 11A, and realizes the flow operation shown in FIG. 3.
- a CPU in a processor and a program stored in a memory implement functions such as an operation unit determination section.
- the input unit 32 has an input circuit (communication circuit), and inputs the image P1 acquired by the imaging unit 12A. With respect to the image P1 input by the input unit 32, the operation unit determination section 36 determines the image group of the operation unit. This group of images is output to the inference engine 37, and the inference engine 37 uses the inference model to infer operation information for reaching the position of a target region such as an affected area, and outputs operation information Iop.
- the operation information Iop includes operation information for operating the operation unit, an endoscopic image at this time, and the like. Note that in this embodiment, the operation information Iop is output by inference using an inference model, but the operation information Iop may be output based on image similarity determination.
- the input unit 32 functions as an input unit that inputs images of the subject's organs in chronological order (for example, see S1 in FIG. 3).
- the ID management unit 33 manages the ID of the subject (patient). As mentioned above, when a specialist performs an examination using the endoscope system 10A, the ID of the subject (patient) is input, and the image P1 associated with this ID is displayed in the endoscope system. It is transmitted from 10A. The ID management unit 33 associates the ID associated with this image P1 with ID information of the subject (patient) recorded in the recording unit 35 or the like. Further, when a non-specialist performs an examination or the like using the second endoscope system 10B, necessary operation information Iop is output based on the ID information.
- the communication unit 34 has a communication circuit and exchanges information with the endoscope system 10A and the second endoscope system 10B. Further, the communication unit 34 may communicate with other servers and in-hospital systems, and in this case, it can collect information from other servers and in-hospital systems, and can also provide information.
- the operation information Iop generated by the inference engine 37 is transmitted to the second endoscope system 10B through the communication section 34. In this case, operation information Iop corresponding to the ID of the subject to be examined using the second endoscope system 10B is transmitted to the communication unit 21B of the second endoscope system 10B through the communication unit 34.
- the communication unit 34 functions as an output unit that outputs the operation unit information recorded in the recording unit (for example, see S23 in FIG. 3).
- the recording unit 35 has an electrically rewritable non-volatile memory, and stores image data that the input unit 32 inputs from the imaging unit 12A, information such as the examinee's (patient) profile, examination history, examination results, etc. Programs and the like used in the control unit 31 can be recorded. Further, when the subject (patient) is examined using the endoscope system 10A (which may include the second endoscope system 10B), the recording unit 35 stores image data based on the image P1 at that time. The operation information Iop inferred and outputted by the inference engine 37 may also be recorded.
- the recording unit 35 records the inspection image 35a and operation unit information 35b. As described above, when a subject (patient) is examined using the endoscope system 10A, the recording unit 35 records image data based on the image P1 at that time. This image data is recorded as an inspection image 35a.
- the operation unit information 35b is recorded for each ID of a subject (patient) who undergoes an examination (including diagnosis and treatment) using the endoscope system 10A. In this case, since one subject may undergo multiple examinations, it is preferable to distinguish them by the date and time of the examination. Furthermore, as explained using FIG. 7, since there are multiple operation units in one examination, the operation unit information 35b records, for each operation unit, a start image 35ba, an end image 35bb, operation information 35bc, and time information 35bd.
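The fields just listed (35ba to 35bd) might be represented by a simple record structure; the following is an illustrative sketch with assumed field names, not the patent's actual data format:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OperationUnitRecord:
    """One operation unit's record, mirroring 35ba-35bd in the text.
    Field names are illustrative; images are stood in for by frame IDs."""
    start_image: str                                      # 35ba
    end_image: str                                        # 35bb
    operations: List[str] = field(default_factory=list)   # 35bc
    elapsed_s: List[float] = field(default_factory=list)  # 35bd

unit = OperationUnitRecord("P12", "P16",
                           operations=["advance", "bend"],
                           elapsed_s=[3.0, 7.5])
assert unit.start_image == "P12" and unit.end_image == "P16"
```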
- the start image 35ba is the first image belonging to the operation unit as a result of the determination by the operation unit determination section 37.
- image P12 is the start image belonging to the first operation unit
- image P15 is the start image belonging to the next operation unit.
- the end image 35bb is the last image belonging to the operation unit as a result of the determination by the operation unit determination section 37.
- image P16 is the end image belonging to the first operation unit
- image P19 is the end image belonging to the next operation unit.
- the image P11 is an image when the endoscope is inserted
- the image P20 is an image when the endoscope is pulled out.
- the operation information 35bc is information regarding the operation state of the endoscope system 10A, and the operation information is recorded for each image data and/or operation unit.
- the operation information may be acquired based on a change in the image acquired by the imaging unit 12A.
- the image changes depending on the operation.
- the image also changes when a water injection operation, suction operation, etc. is performed.
- control unit 31 and the like acquire operation information and record it as operation information 35bc.
- operation information 35bc In addition to acquiring operation information based on images, for example, if operation information performed by the operation unit 17A in the endoscope system 10A is transmitted to the auxiliary device 30 in association with image data, This associated operation information may be acquired.
- the time information 35bd is time information for each individual image in the unit of operation.
- the time information may be information indicating what year, month, day, hour, minute, and second the image was acquired.
- the start of the operation may be set as a reference time, and the time elapsed from this reference time may be used as time information.
- as the operation unit information 35b, an object that serves as a landmark for the target part is determined in the vicinity of the target part such as an affected part (see landmark Ob in FIG. 2), and an image of this landmark Ob (which may include position information) is recorded (see S17 in FIG. 3). Furthermore, an image of the target region Tg (which may include position information) is also recorded as operation unit information 35b. Further, information on the operations performed by the specialist from finding the landmark to reaching the target is also recorded in the recording unit 35 as operation unit information 35b (see S19 in FIG. 3).
- the recording unit 35 functions as a recording unit that records, for each operation unit determined by the operation unit determination unit, information regarding the image and the endoscope operation in this operation unit as operation unit information (for example, see S11 in FIG. 3).
- the recording unit records a start image and an end image among the continuous images belonging to the operation unit, and also records operation information indicating the operation state in the operation unit (for example, see S11 in FIG. 3).
- the recording unit records operation information after finding a landmark near the target (for example, see S17 and S19 in FIG. 3).
- the operation unit determination section 36 performs a determination for dividing the images input in chronological order by the input section 32 into operation units (for example, see S7, S11, etc. in FIG. 3). That is, it determines, based on the image and the like, whether the same operation/movement is being continued. For example, suppose that a specialist advances the distal end of an endoscope linearly, moves forward while bending the distal end at a certain timing, and then moves the endoscope forward in a straight line again after a while. In this case, the images during the forward operation until the bending operation is performed form one operation unit, and the images after the bending operation until the endoscope moves linearly forward again form another operation unit.
- the specialist's operation is not limited to a single one, and multiple operations may be performed at once; for example, a bending operation or a rotation operation may be performed while moving forward. In some cases it is better to treat such a complex operation as one operation unit distinct from the simple operations, and in other cases it is better to divide it into separate units; this can be determined according to the circumstances.
- the operation unit determination unit 36 determines the direction of the operation for the image acquired by the imaging unit based on the asymmetry of the anatomical structure (for example, see S13, S15, etc. in FIG. 3). As described above, it is not easy to express the direction in which the distal end of the endoscope faces, such as anterior, posterior, rightward, and leftward within the body cavity (see, for example, FIG. 5). Therefore, in this embodiment, the direction of operation is determined based on the asymmetry of the anatomical structure.
- the determination of the operation unit may be performed not only based on the image but also based on information such as operation information attached to image data, or may be determined based on information such as the image and operation information. You may also do so.
- a sensor or the like may be provided in the distal end portion and/or the insertion portion of the endoscope, and/or the operation portion, and operation information may be acquired based on the output from the sensor. If a sensor is provided in the so-called flexible tube part, the shape of the scope can be recognized, and as a result, it is possible to more accurately grasp situations such as pressing on the greater curvature of the stomach.
- a sensor may be mounted on the operation section.
- a transmission source may be provided at the distal end of the endoscope, a sensor may be provided outside the body to detect a signal from the transmission source, and operation information may be acquired based on the output from this sensor.
- the operation unit information determined by the operation unit determination section 36 is output to the inference engine 37.
- the operation unit determination unit 36 may include a hardware circuit for making the above-described determination, or may implement the above-described determination using software. Further, the control section 31 may also have this function. In other words, the determination may be made by the hardware circuit of the control unit 31 and/or software by the CPU. Further, the operation unit determination unit 36 may include an inference model and determine the operation unit by inference.
- the operation unit determination unit 36 functions as an operation unit determination unit that divides images of organs acquired in time series into operation units and determines the operation performed for each operation unit (for example, see S7, S11, etc. in FIG. 3).
- the operation unit determination section determines whether or not the operation unit is operated based on whether at least one of the insertion direction, rotation direction, and bending direction of the distal end of the first endoscope has changed based on the image acquired by the imaging section. Divide into units (for example, see S7 in FIG. 3 and FIG. 7).
- the operation unit determination unit determines the direction of the operation in the image acquired by the imaging unit based on the asymmetry of the anatomical structure (for example, see S13 in FIG. 3, P5A, P5B in FIG. 5, etc.).
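The division into operation units at points where the operation changes can be illustrated with a small sketch; the per-frame operation labels here are assumed stand-ins for what would be estimated from the images or sensors, not the actual determination logic:

```python
def split_into_units(frame_ops):
    """Split a per-frame sequence of operation labels into operation units,
    cutting wherever the operation (insertion/rotation/bending) changes.
    Each unit is returned as an inclusive (start_index, end_index) pair."""
    units, start = [], 0
    for i in range(1, len(frame_ops)):
        if frame_ops[i] != frame_ops[i - 1]:
            units.append((start, i - 1))
            start = i
    units.append((start, len(frame_ops) - 1))
    return units

# Forward motion, then a bending operation, then forward motion again.
ops = ["advance", "advance", "advance", "bend", "bend", "advance"]
assert split_into_units(ops) == [(0, 2), (3, 4), (5, 5)]
```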
- the inference engine 37 may be configured by hardware, software (program), or a combination of hardware and software. An inference model is set in this inference engine 37. In this embodiment, the inference engine 37 is provided in the auxiliary device 30, but it may also be provided in a device such as an endoscope and perform inference within the device.
- when the image data of the image P1 is input to the input layer of the inference engine 37 in which the inference model is set, the inference engine 37 performs inference and outputs operation information Iop related to endoscope operation from the output layer.
- this operation information Iop is information for displaying an operation guide (operation advice) so that, when a non-specialist inserts the second endoscope system 10B into a body cavity, operations equivalent to those performed by the specialist can be performed to reach a target site such as an affected area.
- that is, it includes an image for each operation acquired by the specialist and information indicating the operation state of the operation unit at that time. Note that it is not necessary to include all images and all information indicating the operation state, as long as there are images and information that form the key points of the operation.
- the inference engine 37 may generate an inference model for displaying an operation guide by using time-series images (including operation information; see FIG. 1A) obtained from an examination performed by a specialist using the endoscope system 10A.
- training data based on a large number of time-series images is input to the input layer of the inference engine 37.
- FIG. 1A exemplarily shows image groups P2 and P3, but many other image groups are input.
- image P22 is the start image belonging to the first operation unit
- image P25 is the last image belonging to this operation unit
- image P26 is the first image belonging to the next operation unit
- image P29 is the last image belonging to this operation unit.
- image P21 is an image at the time of insertion among a series of time-series images
- image P30 is an image at the time of extraction.
- image P32 is the start image belonging to the first operation unit
- image P35 is the last image belonging to this operation unit
- image P36 is the first image belonging to the next operation unit
- image P39 is the last image belonging to this operation unit.
- the image P31 is an image at the time of insertion
- the image P40 is an image at the time of extraction out of a series of time-series images.
- operation information is added to the image group P2; information Isa indicates that the images are the same, and information Idi indicates that the images are different.
- operation information is added to the image group P3, and the information Isa indicates that the images are the same, and the information Idi indicates that the images are different.
- An inference model for operation guidance can be generated by using a large number of images such as image groups P2 and P3 as training data and performing machine learning such as deep learning using this training data.
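The Isa/Idi annotation of images described above might be sketched as follows; the frame-to-unit assignments are illustrative, not the patent's actual training data:

```python
def label_pairs(unit_of):
    """Label every frame pair 'same' if both frames fall in one operation
    unit, else 'different' -- a stand-in for the Isa/Idi annotations.
    `unit_of` maps frame ID -> operation unit ID (illustrative values)."""
    frames = sorted(unit_of)
    return {(a, b): ("same" if unit_of[a] == unit_of[b] else "different")
            for i, a in enumerate(frames) for b in frames[i + 1:]}

# Frames P22..P25 in one unit and P26..P29 in the next, as in image group P2.
unit_of = {"P22": 1, "P23": 1, "P25": 1, "P26": 2, "P29": 2}
labels = label_pairs(unit_of)
assert labels[("P22", "P25")] == "same"       # Isa
assert labels[("P25", "P26")] == "different"  # Idi
```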
- Deep learning is a multilayered version of the "machine learning" process that uses neural networks.
- a typical example is a forward propagation neural network, which sends information from front to back to make decisions.
- the simplest version of a forward propagation neural network may have three layers: an input layer consisting of N1 neurons, an intermediate layer consisting of N2 neurons given by parameters, and an output layer consisting of N3 neurons corresponding to the number of classes to be discriminated. The neurons of the input layer and the intermediate layer, and those of the intermediate layer and the output layer, are connected by connection weights, and a bias value is added in the intermediate layer and the output layer, whereby a logic gate can easily be formed.
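Such a three-layer network with connection weights and bias values forming a logic gate can be illustrated with a small sketch; the hand-set weights here realize XOR, a step function stands in for the neuron's threshold, and all values are illustrative rather than learned:

```python
def step(x):
    """Threshold activation: fire (1) when the weighted sum exceeds zero."""
    return 1 if x > 0 else 0

def forward(inputs, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a 3-layer network: input -> intermediate -> output.
    Each neuron sums its weighted inputs plus a bias and applies `step`."""
    hidden = [step(sum(w * x for w, x in zip(ws, inputs)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    return [step(sum(w * h for w, h in zip(ws, hidden)) + b)
            for ws, b in zip(w_out, b_out)]

# Hand-set weights realizing XOR: hidden neurons act as OR and NAND gates,
# and the output neuron combines them as an AND gate.
w_h = [[1, 1], [-1, -1]]; b_h = [-0.5, 1.5]
w_o = [[1, 1]]; b_o = [-1.5]
for a, b, want in [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]:
    assert forward([a, b], w_h, b_h, w_o, b_o) == [want]
```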
- a neural network may have three layers if it performs simple discrimination, but by having a large number of intermediate layers, it is also possible to learn how to combine multiple features in the process of machine learning. In recent years, systems with 9 to 152 layers have become practical in terms of learning time, judgment accuracy, and energy consumption.
- a "convolutional neural network” that performs a process called “convolution” that compresses image features, operates with minimal processing, and is strong in pattern recognition may be used.
- a "recurrent neural network" (fully connected recurrent neural network), which can handle more complex information and allows information to flow in both directions, may be used to support analysis of information whose meaning changes depending on order.
- NPUs: neural network processing units
- AI: artificial intelligence
- Machine learning methods include methods such as support vector machine and support vector regression.
- the learning here involves calculating the weights, filter coefficients, and offsets of the classifier, and in addition to this, there is also a method that uses logistic regression processing.
- in either case, the machine makes the decision, but humans need to teach the machine how to make it.
- in this embodiment, a method of deriving the image judgment by machine learning is adopted, but a rule-based method that applies rules acquired by humans through empirical rules and heuristics may also be used.
- the second endoscope system 10B shown in FIG. 1B is an endoscope used when a non-specialist performs an examination of the subject.
- This second endoscope system 10B may be the same model as the endoscope system 10A, or may be completely the same device, but in this embodiment, it is shown as a different model of endoscope.
- the second endoscope system 10B provides information on the subject (including the patient) who has undergone organ examination (including diagnosis and treatment) using the first endoscope system. It functions as a second endoscope system for observing organs to be observed.
- the auxiliary device 30 outputs an operation auxiliary image group P4 to the second endoscope system 10B for providing operation guidance at the time of re-examination by a non-specialist.
- the operation auxiliary image group P4 for re-examination is a series of chronological images used when re-examining with the second endoscope system 10B, from the image at the time the second endoscope system 10B is inserted into the body cavity to the image P43 corresponding to the position of the target site such as the affected area.
- The operation auxiliary image group P4 for reexamination may be created based on images P11 to P20, etc., of the image group P1 acquired during the first examination.
- The image P43 in the operation auxiliary image group P4 at the time of reexamination includes operation information Iop, which is the result of inference by the inference engine 36, and may display a guide such as "Do this operation".
- A landmark may be set in front of the target site, and the image serving as the landmark Ob (the image at position L3 in FIG. 2) may be displayed. A specification may also be adopted in which the endoscope temporarily stops in front of the landmark and more detailed guidance is provided on how to access the target site from that position.
- The example shown in FIG. 2 is a case in which an easily recognizable location (position L3) is set as a landmark (for example, the pylorus) and the target is viewed by bending from there; image P43 corresponds to the target site Tg.
- Both the image of the landmark Ob and the image P43 corresponding to the target Tg may be displayed, or only the image P43 of the target region may be displayed.
- The second endoscope system 10B includes a control unit 11B, an imaging unit 12B, a light source unit 13B, a display unit 14B, an ID management unit 15B, a recording unit 16B, and an operation unit 17B. These are the same as the control unit 11A, imaging unit 12A, light source unit 13A, display unit 14A, ID management unit 15A, recording unit 16A, and operation unit 17A of the endoscope system 10A, so only the additional configurations and functions provided in the second endoscope system 10B will be described, and detailed explanations of the common parts will be omitted.
- The control unit 11B is composed of one or more processors having a processing device such as a CPU (Central Processing Unit), a memory storing a program (the program may be stored in the recording unit 16B), etc., and, by executing the program, controls each part within the second endoscope system 10B.
- the control unit 11B performs various controls when the endoscope system 10B reexamines the subject (patient).
- the CPU of the control unit 11B executes the program stored in the recording unit 16B, etc., and realizes the operation of the flow shown in FIG.
- the CPU in the processor and the program stored in the memory implement the functions of the acquisition section, operation determination section, operation guide section, and the like.
- The control unit 11B causes the guide unit 19B to execute operation guidance, using the images obtained by the imaging unit 12B at the time of reexamination, the operation auxiliary image group P4 output from the auxiliary device 30, etc., in order to reach a target area such as an affected area.
- Inference may be performed using the guide unit 19B in which the inference model is set, or similar image determination may be performed by the similar image determination unit 23B, which will be described later.
- the operation guide created by the control unit 11B may be displayed on the display unit 14B, and the fact that the distal end of the endoscope is near the object or target region may be displayed on the display unit 14B.
- The imaging unit 12B is the same as the imaging unit 12A, so a detailed explanation will be omitted; the imaging unit 12B functions as an imaging unit that acquires images of the subject's organs in chronological order (for example, see S33 in FIG. 4).
- The communication unit 21B has a communication circuit (including a transmitting circuit and a receiving circuit) and exchanges information with the auxiliary device 30; for example, it receives the operation information Iop output from the auxiliary device 30.
- the operation information Iop includes a start image, an end image, operation information, and time information for each operation unit (these are recorded in the recording unit 35 as operation unit information 35b).
- The operation information Iop includes a target region image (P43) and may also include a landmark image. It may also include the ID of the subject (patient), etc., so that the re-examination is performed on the same organ of the same subject (patient) that the specialist examined using the first endoscope system 10A.
- The operation information Iop may be only the necessary portions of the operation unit information 35b.
- the image data acquired by the imaging section 12B may be transmitted to the auxiliary device 30.
- The communication unit 21B may also communicate information with the endoscope system 10A, in addition to the auxiliary device 30. Furthermore, the communication unit 21B may communicate with other servers and in-hospital systems; in this case, it can collect information from them and can also provide information. Alternatively, an inference model generated by an external learning device may be received.
- the communication unit 21B functions as an acquisition unit that acquires time-series operation content information in the first endoscope system as operation unit information (for example, see S31 in FIG. 4).
- the above-mentioned operation unit information is image change information estimated using the asymmetry of the observed organ (for example, see FIG. 7).
- the above-mentioned operation unit information is image change information indicating a succession of the same operations (for example, see image P1 in FIG. 1A and images P6a to P6f in FIG. 7).
- the asymmetry information of the organ to be observed is determined based on the anatomical positional relationship of a plurality of parts within the specific organ (see, for example, FIG. 7).
- The communication unit 21B functions as an input unit for inputting the recorded operation unit information for a subject who has undergone an examination using the first endoscope system (for example, see S31 in FIG. 4).
- The signal output unit 22B outputs a signal when the distal end of the second endoscope system 10B reaches the vicinity of a target site such as an object or an affected area. For example, by emitting light from the light source unit 13B, the emitted light may be made visible from outside the gastrointestinal wall, thereby informing a doctor or the like of its position.
- the similar image determination unit 23B compares the image data acquired by the imaging unit 12B with the operation assistance image group P4 at the time of reexamination, and determines the degree of similarity.
- Since the operation auxiliary image group P4 at the time of reexamination includes a start image, an end image, etc. for each operation unit, these images are compared with the current endoscopic image acquired by the imaging unit 12B to determine whether the images are similar.
- There are various methods for determining the similarity of images, such as pattern matching; a method suitable for this embodiment may be selected from among them as appropriate.
- The similar image determination unit 23B determines whether each image of the operation auxiliary image group P4 is similar to the image acquired by the imaging unit 12B. In making this determination, since the operation auxiliary image group P4 is divided into operation units, the similar image determination unit 23B determines which operation unit the currently acquired image group is similar to. If the image acquired by the imaging unit 12B is similar to the end image of an operation unit, the guide unit 19B displays the operation information on the display unit 14B based on the operation information Iop.
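As one possible concrete form of the similarity determination above, the following sketch compares a live frame with a stored start/end image using normalized histogram intersection. This is only an assumption for illustration: the embodiment merely says a pattern matching method may be selected as appropriate, and the 8-bin histogram and 0.9 threshold here are hypothetical.

```python
def histogram(image, bins=8):
    # image: 2D list of grayscale pixel values in 0..255.
    h = [0] * bins
    for row in image:
        for px in row:
            h[min(px * bins // 256, bins - 1)] += 1
    total = sum(h)
    return [c / total for c in h]

def similarity(img_a, img_b):
    # Histogram intersection: 1.0 for identical brightness distributions,
    # 0.0 for completely disjoint ones.
    ha, hb = histogram(img_a), histogram(img_b)
    return sum(min(a, b) for a, b in zip(ha, hb))

def is_similar(live_frame, stored_image, threshold=0.9):
    # Hypothetical threshold standing in for "high degree of similarity".
    return similarity(live_frame, stored_image) >= threshold

if __name__ == "__main__":
    a = [[10, 20], [200, 210]]
    b = [[12, 22], [205, 215]]
    print(is_similar(a, b))  # True: near-identical brightness distribution
```

A practical system would use richer features (template matching, learned embeddings) rather than raw brightness histograms, but the interface, comparing the current frame against each operation unit's stored images, is the same.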
- The similar image determination unit 23B determines the operation process of the second endoscope system 10B by detecting changes in the endoscopic image pattern. Based on the operation unit information Iop, etc., the continuous images acquired by the imaging unit 12B are divided into operation units, and the operation process currently being performed (insertion operation, rotation operation, bending operation, etc.) is determined for each operation unit (see FIG. 7).
- The similar image determination unit 23B functions as an insertion operation determination unit that estimates the operation process when a subject undergoes an examination (including diagnosis and treatment) using the second endoscope system (see S37 in FIG. 4).
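The division of continuous images into operation units can be sketched as follows: consecutive frames whose estimated image-change pattern is the same are grouped into one unit, so a unit runs from one pattern change to the next. The per-frame operation labels below are hypothetical stand-ins for a real motion analysis, which the embodiment leaves to image processing or inference.

```python
def split_into_operation_units(frame_ops):
    # frame_ops: per-frame operation label, e.g. "insert", "rotate", "bend".
    units = []
    start = 0
    for i in range(1, len(frame_ops) + 1):
        if i == len(frame_ops) or frame_ops[i] != frame_ops[start]:
            # Record the unit with its start and end frame indices,
            # corresponding to the unit's start image and end image.
            units.append({"operation": frame_ops[start],
                          "start_frame": start, "end_frame": i - 1})
            start = i
    return units

if __name__ == "__main__":
    ops = ["insert"] * 3 + ["rotate"] * 2 + ["insert"] * 2 + ["bend"]
    for unit in split_into_operation_units(ops):
        print(unit)
```

Each resulting unit then carries enough indexing information to fetch its boundary images for the similarity comparison described above.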
- When the similar image determination unit 23B finds an image with a high degree of similarity to the image P43 indicating the target object (see landmark Ob in FIG. 2), it determines that a target region such as an affected area (target region Tg in FIG. 2) is located near this position. The display unit 14B therefore displays that the target area, such as the affected area, is near.
- A doctor can then search for a target area such as an affected area by carefully examining the area in accordance with the operation information. If it cannot be found immediately, air may be supplied, or the endoscope may be pulled back slightly to create a space for observation. If a target site such as an affected area is found, follow-up observation of the target site can be performed. Further, depending on the condition of the target site, treatment such as surgery may be necessary.
- similar image determination may be performed by inference instead of being determined by the similar image determination unit 23B based on the degree of similarity of images. That is, an inference engine may be provided in the similar image determination unit 23B, an inference model for similar image determination may be set in this inference engine, and similarity may be determined by inference.
- The inference engine functions as a similar image estimation unit having a similarity estimation model that estimates the similarity of images based on endoscopy images. Even when a non-specialist operates the endoscope, the endoscope can be guided to the vicinity of a target region, such as an affected region, by using the determination result of the similar image determination unit 23B.
- The guide unit 24B provides operation guidance (which may also be called operation advice) to a non-specialist who uses the second endoscope system 10B, based on the determination result of the similar image determination unit 23B. That is, the guide unit 24B divides the time-series images acquired by the imaging unit 12B into operation units using the determination result of the similar image determination unit 23B, compares the currently acquired successive images with the operation information included in the operation unit information, and may display the quality of the operation based on the comparison result. In other words, it guides the user so that the operation is equivalent to that performed by a specialist, so that organs such as affected areas can be observed under the same observation conditions as those observed by the specialist.
- The guide unit 24B may also perform event determination and show a corresponding display. For example, in a certain operation unit, guidance may be provided for a rotation operation, a bending operation, or an operation such as water injection or air supply.
- The guide unit 24B may display an operation guide for proceeding to the next operation unit at the timing of switching between operation units.
- This guide display may be superimposed on the endoscopic image displayed on the display unit 14B, or may be conveyed using an audio guide or the like. That is, the display unit 14B may not only display the guide visually but may also convey the guide information to the non-specialist by voice or the like.
- The guide unit 24B outputs guide information so that, as when a specialist performs an examination, characteristic parts of the observation target organs can be observed under the same observation conditions as with the first endoscope system 10A (for example, see S37 and S47 in FIG. 4).
- Here, the same observation conditions include the size of the object photographed within the screen, the angle of view, etc.; that is, they are conditions under which the positional relationship between the imaging unit and the observation object when observing the observation object is the same.
- The guide unit 24B compares the operation process estimated by the insertion operation determination unit with the operation unit information, and functions as an operation guide unit that outputs operation guide information for operating the second endoscope system in order to observe the characteristic site in the observation target organ with the second endoscope system (for example, see S37 and S39 in FIG. 4).
- The operation guide information output by the operation guide unit is guide information for observing the characteristic parts of the observation target organ under the same observation conditions as the first endoscope system (for example, see S37 and S39 in FIG. 4).
- Further, when comparing the operation process estimated by the insertion operation determination unit with the operation unit information in order to generate the operation guide information, a plurality of temporally adjacent pieces of operation unit information are compared; if an overlapping part does not require follow-up observation, the operation unit information is corrected, and the comparison is made with the operation unit information excluding the operation of this overlapping part.
- The similar image determination unit 23B and the guide unit 24B divide the images acquired in time series into operation units, estimate the operation state of the second endoscope system for each operation unit, and function as an operation guide unit that compares this operation state with the operation unit information and outputs guide information for observation under the same observation conditions as the first endoscope system (for example, see S37 and S39 in FIG. 4).
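The comparison step can be sketched as matching the currently estimated operation unit against the specialist's recorded unit for the same step and emitting a guide message. The field names, duration heuristic, and message texts are hypothetical; the embodiment only requires that the comparison yield guidance toward the same observation conditions as the first examination.

```python
def guide_message(current_unit, expert_unit):
    # current_unit / expert_unit: dicts with an estimated operation label and
    # a duration in seconds (illustrative fields, not from the patent).
    if current_unit["operation"] != expert_unit["operation"]:
        return (f"Expected '{expert_unit['operation']}' here per the "
                f"specialist's record, but '{current_unit['operation']}' "
                f"was detected.")
    # Compare durations as a rough proxy for matching observation conditions.
    if current_unit["duration_s"] > 2 * expert_unit["duration_s"]:
        return "This step is taking much longer than in the first examination."
    return "Operation matches the specialist's record; proceed to next unit."

if __name__ == "__main__":
    expert = {"operation": "bend", "duration_s": 4.0}
    print(guide_message({"operation": "rotate", "duration_s": 3.0}, expert))
    print(guide_message({"operation": "bend", "duration_s": 5.0}, expert))
```

In the embodiment the message would be rendered on the display unit 14B or as an audio guide rather than printed.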
- A guide display is shown instructing the user to bend or rotate the tip of the second endoscope system 10B and proceed to route R2. Furthermore, when the tip approaches the landmark Ob while traveling along route R3, a display indicating that the target region Tg such as the affected area is approaching is shown.
- In this way, the operation performed when the specialist uses the endoscope system 10A to reach a target site Tg such as an affected area is recorded, and operational guidance for reaching this target site Tg is provided. Thus, even a non-specialist using the second endoscope system 10B can operate it, easily reach the target site Tg, and perform observation, treatment, etc.
- As described above, the second endoscope system 10B can be guided so that the endoscopist can observe the observation target area under the same observation conditions as when using the first endoscope system 10A. There are parts of the body to be observed that require such follow-up observation.
- For such follow-up observation, the following auxiliary information can be acquired as clues for the examination, and the examination can be performed using this auxiliary information.
- Information for viewing the follow-up observation area, such as the size, shape, unevenness, and color of the lesion.
- Imaging environment information, such as: whether pigments such as indigo carmine or stains such as methylene blue are used; the observation light used, such as WLI (White Light Imaging) or NBI (Narrow Band Imaging); the presence or absence of image processing settings such as structure enhancement; the air supply volume; the patient's body position; equipment information such as the type of video processor and scope; the condition of the mucous membrane around the lesion (is there anything confusing nearby?); the distance between the lesion and the scope; the viewing angle; and the amount of insertion, amount of twisting, angle, and degree of bending of the scope.
- Information on imaging and past findings, such as the degree of gag reflex, and time-related information, such as the time from the start of the examination and the timing of the examination.
- AI (artificial intelligence) such as the following can also be used.
- Detection AI for detecting areas that require follow-up observation, such as AI for site recognition, CADe (Computer-Aided Detection: lesion detection support) using images, CADx (Computer-Aided Diagnosis: lesion differentiation support), and AI that detects treated areas. Note that information written in the electronic medical record may also be substituted.
- AI for recognizing the characteristics of areas that require follow-up observation, such as AI that detects the size, shape, response, and color of the lesion, and AI that detects the condition of the surrounding mucous membranes.
- AI for recognizing the observation environment: for example, AI that detects from an image whether a dye such as indigo is used.
- AI for estimating the air supply amount: for example, AI that estimates based on air pressure sensor output, AI that estimates based on cumulative air supply time, and AI that estimates based on images.
- AI for estimating the distance to the lesion: for example, AI that estimates the distance between the lesion and the endoscope tip from the insertion amount of the endoscope, the degree of twisting, the angle, the bending of the endoscope tip, etc.
- The operation of endoscope 1 is realized by the cooperation of the control unit 11A in the endoscope system 10A and the control unit 31 in the auxiliary device 30. Specifically, it is realized by the CPU provided in each control unit controlling each part in the endoscope system 10A and the auxiliary device 30 according to a program stored in memory.
- imaging is first started (S1).
- the imaging device in the imaging unit 12A acquires time-series image data at time intervals determined by the frame rate.
- image data inside the body cavity is acquired, and this image data is subjected to image processing by an image processing circuit in the imaging section 12A.
- the display unit 14A displays an image of the inside of the body cavity using image data that has undergone image processing.
- the specialist operates the endoscope system 10A while viewing this image, and moves the distal end toward the position of a target site such as an affected area.
- the image data subjected to image processing is transmitted to the input section 32 in the auxiliary device 30 through the communication section 21A. In this step, it can be said that images of the subject's organs are acquired in time series by the imaging unit.
- the operation unit determination unit 36 in the auxiliary device 30 determines whether the image of the inner wall surface in the body cavity acquired by the endoscope system 10A has changed.
- the operation unit determination section 36 makes a determination based on a change in the image.
- this determination is not limited to the operation unit determination unit 36, and may be performed by other blocks such as the control unit 31.
- an inference model for determining changes in the inner wall surface may be set in the inference engine 37, and the inference engine 37 may determine changes in the image of the inner wall surface. As a result of this determination, if there is no change in the image of the inner wall surface, a standby state is entered until a change occurs.
- If the image of the inner wall surface has changed, the image is temporarily recorded (S5).
- the image data input through the input section 32 is temporarily recorded in the recording section 35 as an inspection image 35a.
- the memory is not limited to the recording unit 35 and may be any memory that can temporarily record image data.
- Next, it is determined whether the image change pattern has changed due to the insertion direction, rotation, tip bending, etc. (S7).
- In this determination, given that the image of the inner wall surface was found to have changed in step S3, it is determined whether the cause of this change is an endoscope operation by the specialist, for example, a change in the insertion direction of the endoscope tip, a rotation operation of the tip, or a bending operation of the tip. In other words, it is determined whether the change in the image change pattern is due not simply to a change in the part of the organ being observed, but to an operation by the specialist.
- The change in this image change pattern is determined by the operation unit determination unit 36 based on the image input through the input unit 32. For example, in FIG. 2, when the endoscope travels straight along route R1, the direction of the tip curves at position L1, and the image change pattern changes at this position L1.
- The change in the image change pattern may be a case where the image pattern itself changes (for example, the image pattern changes from a circle to a square), or a case where the way the image pattern changes changes. In either case, it may be determined whether the image has changed due to an operation by the specialist or the like.
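The two cases above can be distinguished numerically: (a) the image pattern itself changes, detectable as a large frame-to-frame difference, and (b) the way the pattern changes changes, detectable as a jump in the difference signal itself (a second-order change). In this sketch a single scalar "frame signature" and the thresholds are hypothetical stand-ins for real image features.

```python
def pattern_changes(signatures, threshold=5.0):
    # Case (a), first-order: |s[i] - s[i-1]| exceeds the threshold.
    return [i for i in range(1, len(signatures))
            if abs(signatures[i] - signatures[i - 1]) > threshold]

def change_of_change(signatures, threshold=5.0):
    # Case (b), second-order: the frame-to-frame difference itself jumps,
    # e.g. a steady drift suddenly becoming a fast sweep.
    diffs = [signatures[i] - signatures[i - 1]
             for i in range(1, len(signatures))]
    return [i + 1 for i in range(1, len(diffs))
            if abs(diffs[i] - diffs[i - 1]) > threshold]

if __name__ == "__main__":
    # Steady drift of +1 per frame, then a sudden +10 step at index 4.
    sig = [0, 1, 2, 3, 13, 14, 15]
    print(pattern_changes(sig))    # [4]
    print(change_of_change(sig))   # [4, 5]
```

Either detector firing would be a candidate boundary for a new operation unit, subject to the check that the change stems from an operation rather than a change in the observed organ part.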
- For this determination, operation information associated with the image data, or sensor output from a sensor provided at the distal end of the endoscope, may be used instead of images.
- the determination may be made based on information such as operation information attached to the image data, or may be determined based on the image and information such as operation information.
- a sensor or the like may be provided at the distal end of the endoscope, and operation information may be acquired based on the output from this sensor or the like.
- Alternatively, a signal source may be provided at the tip of the endoscope, a sensor may be provided outside the body to detect signals from the source, and operation information may be obtained based on the output from this sensor.
- this determination may be made using an inference model set in the inference engine 37 (or the inference engine 18A) in addition to the operation determination unit 36.
- As a result of the determination in step S7, if the image pattern has not changed due to the insertion direction, rotation, or tip bending, other events are executed (S9).
- Other events include various events performed by medical specialists, such as air supply operations, water injection operations, suction operations, and still image photography.
- When the specialist executes other events, the location and type of each event are recorded. These recorded events are displayed when a non-specialist performs an examination or the like (see S39 in FIG. 4).
- Other events include operations and processing other than changes in the insertion direction, rotation (vertical orientation), and tip bending: for example, the use of treatment instruments; changes in shooting parameters such as exposure and focus; image processing including HDR (High Dynamic Range) and depth compositing; switching light sources, such as for special light observation; image processing that emphasizes specific structures; and operations and processing that involve extra effort to discover objects, such as dye scattering and staining.
- The information that such extra effort was taken becomes a useful guide.
- As an observation guide, there may be a guide that instructs the user to remove a polyp when one is found. To realize this guide, how much of the record of the first endoscope system is used for the guide may be selected as appropriate.
- As a result of the determination in step S7, if the image change pattern has changed due to the insertion direction, rotation, tip bending, etc., the operation content information is recorded in chronological order, with the continuous part of the image change pattern treated as an "operation unit", and the operation starting point image and end point image are recorded (S11).
- Here, an operation unit is a series of images from the image for which it is determined that the image change pattern has changed until the next image for which such a change is determined.
- the image data is recorded in the recording section 35 in chronological order.
- the image that is the starting point of the operation unit in this series of images is recorded as a start image 35ba, and the last image in the series of images is recorded as an end image 35bb.
- In step S11, the operation unit determination unit 36 performs image analysis on the series of images to obtain operation information, extracts operation information attached to the images, and records these as operation information 35bc. Further, the operation unit determination unit 36 extracts time information from the time information of the first and last images included in the operation unit, and records it as time information 35bd. Note that these pieces of information need not be recorded all at once, but may be extracted and recorded as appropriate while steps S3 → S21 → S3 are repeated.
- Step S11 can be said to be a determination step that divides the images of the organs acquired in chronological order into operation units, and determines the operation performed by the first endoscope for each operation unit. Further, step S11 can also be said to be a recording step of recording, for each determined operation unit, the image and information regarding the endoscope operation in this operation unit in the recording section as operation unit information.
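The operation unit record described above (start image 35ba, end image 35bb, operation information 35bc, time information 35bd) can be sketched as a simple data structure. The dataclass layout is an assumption for illustration; the patent specifies which pieces of information are recorded, not a concrete format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OperationUnitRecord:
    start_image: bytes   # 35ba: image at the start of the operation unit
    end_image: bytes     # 35bb: last image of the operation unit
    operation_info: str  # 35bc: e.g. "insert", "rotate tip clockwise"
    start_time_s: float  # 35bd: time information of the first image
    end_time_s: float    # 35bd: time information of the last image

@dataclass
class ExaminationRecord:
    subject_id: str
    units: List[OperationUnitRecord] = field(default_factory=list)

    def append_unit(self, unit: OperationUnitRecord) -> None:
        # Units are appended in chronological order as the flow repeats
        # S3 -> S21 -> S3 during the examination.
        self.units.append(unit)

if __name__ == "__main__":
    rec = ExaminationRecord(subject_id="patient-001")
    rec.append_unit(OperationUnitRecord(b"...", b"...", "insert", 0.0, 12.5))
    print(len(rec.units))  # 1
```

Keying the record by a subject ID matches the later step in which related data is transmitted based on the ID of the designated subject.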
- Further, the operation unit determination unit 36 analyzes the image input by the input unit 32 and determines whether there is an image change based on the asymmetry of the anatomical structure. For example, in FIG. 7, from time T3 to time T4, the protrusion of the cavity changes from the 1 o'clock direction at the upper right to the 12 o'clock direction. In this way, image changes can be detected by utilizing the asymmetry of organs.
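The asymmetry-based detection above can be sketched as follows: if the clock direction of a known asymmetric feature (such as the protrusion seen from time T3 to T4) shifts between frames, a tip-direction change is inferred. The clock-hour representation and zero-hour tolerance are hypothetical simplifications of the image analysis.

```python
def detect_direction_changes(clock_positions, tolerance_hours=0):
    # clock_positions: per-frame clock direction (1..12) of the same
    # asymmetric anatomical feature within the endoscopic image.
    changes = []
    for i in range(1, len(clock_positions)):
        # Shortest angular distance on the 12-hour dial.
        diff = abs(clock_positions[i] - clock_positions[i - 1])
        diff = min(diff, 12 - diff)
        if diff > tolerance_hours:
            changes.append((i, clock_positions[i - 1], clock_positions[i]))
    return changes

if __name__ == "__main__":
    # The protrusion moves from the 1 o'clock to the 12 o'clock direction.
    print(detect_direction_changes([1, 1, 1, 12, 12]))  # [(3, 1, 12)]
```

Each detected change would then trigger the recording of the new operating direction, as in the step that follows.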
- As a result of the determination in step S13, if there is a change in the image, it is determined that there has been a change in the direction of the tip, and the change in the operating direction is recorded in the recording unit 35 (S15).
- the direction of the tip of the endoscope can be determined using the asymmetry of the anatomical structure, and since there was a change in the image in step S13, it can be said that the direction of the tip has changed.
- the changed operation direction is recorded in the recording section 35. Note that when the operation direction changes, the next series of images may be recorded as an "operation unit.”
- A landmark is, for example, the landmark Ob in FIG. 2: an object that is located near the target site Tg, such as an affected area, and serves as a marker when searching for the target site Tg. Since the image information and/or position information of this landmark Ob is included in the operation information output from the auxiliary device 30, the guide unit 24B determines, based on this information and the image acquired by the imaging unit 12B, whether or not the landmark has been discovered.
- As a result of the determination in step S17, if a landmark is found, the landmark image is recorded, along with the operations performed before its discovery (S19).
- That is, a landmark image is recorded. Furthermore, since the target area such as the affected area is located near the landmark, the specialist searches for the target area in the vicinity, and the operations performed until the target area image is found are recorded. In other words, the operations performed by the specialist from the landmark to the target site are recorded. With this operation record, even a non-specialist can easily reach the target site by operating the endoscope with reference to it.
- the flow shown in FIG. 3 assumes that a method of accessing a target region using a landmark as a starting point is useful, and shows an example in which the process is performed in the order of landmark discovery and target region discovery.
- steps S17 and S19 may be omitted and the target region may be directly searched for.
- Whether or not to record it as a landmark may be determined by a specialist, or may be automatically recorded using AI or the like.
- There are also cases where the endoscope is pulled out or removed and places where pulling out is difficult; in such cases, the image of the end of the difficult place may be recorded in the operation unit information, and when the image of the end of the difficult place is detected, the target may be reset.
- Next, it is determined whether the target region has been found (S20). Whether or not an area is the target area may be determined by the specialist recording that fact, by a specific operation such as taking a still image, or automatically by AI based on an image, operation, etc. When a target region is found, an image of the target region may be recorded. As a result of this determination, if no target region has been found, the process returns to step S19.
- If the target region is found in step S20, or if no target object is found in step S17, it is then determined whether or not to end (S21). If the specialist finds the target site and completes the necessary recording, it may be determined that the process is complete. Furthermore, if there are multiple target sites, the process may be determined to be finished when the last target site is found. Alternatively, the process may be determined to have ended when all operations, such as pulling the endoscope out of the body cavity, have been completed. As a result of the determination in this step, if the process has not been completed, the process returns to step S3 and the above-described operation is executed.
- If it is determined to end, related data regarding the landmark, target region, etc. is transmitted (S23).
- That is, since the operation information recorded by the specialist up to reaching the target site is available, this information is transmitted to a terminal, server, etc. that requires it. For example, if there is a request from the second endoscope system 10B to transmit related data regarding landmarks, target regions, etc., the corresponding data may be transmitted based on the ID of the designated subject, etc. Alternatively, the operation unit information 35b recorded in the recording unit 35 may be transmitted to the outside all at once.
- This step S23 functions as an output step for outputting the operation unit information recorded in the recording section. After data transmission is performed in step S23, this flow ends.
- As described above, the target object exists near the target site, such as an affected area, and if the target object can be found, the target site can be found easily. Furthermore, if the pre-discovery operations are recorded, the target region can be reached even more easily based on this operation record. That is, if a specialist records information on how to reach a target site such as an affected area, a non-specialist can use this information to easily reach the target site (see the flowchart in FIG. 4).
- The operation unit information may be any information that can be used by a non-specialist using the second endoscope system 10B to reach a target site such as an affected area, and is not limited to the information recorded in this flow.
- start image: the starting point image of each operation unit
- an image change based on the asymmetry of the anatomical structure may be determined.
- time information such as elapsed time from the start of an examination or the like may be used as operation unit information.
- time information related to the end timing may be used as the operation unit information.
- an object having the character of a landmark
- it can encourage them to observe it carefully. It is also possible to record only the discovery of a target area such as an affected area. Further, in steps S7 and S11, recording was performed in units of operation based on changes in the image change pattern, and in steps S13 and S15, the distal direction was determined and recorded based on the asymmetry of the anatomical structure. These processes need not be performed in separate steps, and may be performed all at once.
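As an illustrative sketch only (not part of the disclosed embodiment), the segmentation into operation units described in steps S7 and S11 — cutting the time series wherever the image-change pattern changes — might look like the following; the motion labels and the record layout are assumptions made for illustration:

```python
def segment_operation_units(motion_labels):
    """Group consecutive frames whose estimated motion is the same into
    operation units: a unit starts when the image-change pattern changes
    and ends immediately before the next change."""
    units = []
    start = 0
    for i in range(1, len(motion_labels) + 1):
        # A unit closes at the end of the sequence or when the label changes.
        if i == len(motion_labels) or motion_labels[i] != motion_labels[start]:
            units.append({
                "operation": motion_labels[start],
                "start_frame": start,   # frame of the start image (cf. 35ba)
                "end_frame": i - 1,     # frame of the end image (cf. 35bb)
            })
            start = i
    return units

# Hypothetical per-frame motion estimates for six frames.
labels = ["insert", "insert", "rotate_cw", "rotate_cw", "rotate_cw", "bend_up"]
units = segment_operation_units(labels)
```

Each resulting unit pairs one continuous operation with the frames bounding it, mirroring how a start image and end image delimit the recorded operation unit information.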
- the endoscope system 10A and the auxiliary device 30 perform processing in cooperation.
- the present invention is not limited to this, and the endoscope system 10A may independently record operation information until the specialist reaches the target site.
- the determination of the image change in step S3 is executed by the image processing circuit in the control unit 11A and/or the imaging unit 12A, and if there is a change in the image, the endoscope system 10A determines in step S5 that the image has changed.
- the data is temporarily recorded in a memory such as the recording unit 16A inside the endoscope system 10A.
- the determinations in steps S7, S13, and S17 are likewise executed by the image processing circuit in the control unit 11A and/or the imaging unit 12A, and if there is a change, the result is recorded in a memory such as the recording unit 16A in the endoscope system 10A. Then, when it is determined in step S21 that the process has ended, all the information recorded in the endoscope system 10A up to that point is transmitted to the auxiliary device 30 all at once (see S23).
- the operation of the endoscope 2 operated by a non-specialist using the second endoscope system 10B until reaching a target site such as an affected area will be described.
- the purpose of this operation is for a non-specialist to use the second endoscope system 10B to perform an examination of the same target area, such as an affected area, on the same subject who was previously examined by a specialist.
- the operation information (based on the operation unit information 35b in FIG. 1A, etc.) used when the specialist performed the examination is provided to the second endoscope system 10B, and the non-specialist performs the examination with reference to this operation information.
- the operation of the endoscope 2 is achieved through control by a controller such as the CPU of the control section 11B in the second endoscope system 10B, which controls each section in the second endoscope system 10B according to a program recorded in memory.
- this flow realizes an endoscopic examination method in which, for a subject whose organ has been examined using the first endoscope system, the second endoscope system is used to observe the organ of the subject to be observed.
- the control unit 11B acquires related data such as landmarks and target parts from the auxiliary device 30 through the communication unit 21B.
- the operation unit information 35b is recorded in the recording unit 35 of the auxiliary device 30 (for example, see S23 in FIG. 3). Therefore, the control unit 11B transmits the ID (recorded in the ID management unit 15B) of the subject to be examined (including diagnosis and treatment) using the second endoscope system 10B to the auxiliary device 30, and acquires data regarding landmarks, target parts, etc. from the operation unit information 35b corresponding to the subject ID.
- this step S31 can be said to be a step in which time-series operation content information in the first endoscope system (endoscope system 10A) is acquired as operation unit information.
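The lookup of recorded operation unit information by subject ID described above can be sketched, purely for illustration, as a keyed retrieval; the record structure below is a hypothetical stand-in for the operation unit information 35b:

```python
def fetch_operation_unit_info(records, subject_id):
    """Return the recorded operation unit information for the given
    subject ID, or None if no record exists for that subject."""
    return records.get(subject_id)

# Hypothetical store in the auxiliary device, keyed by subject ID.
records = {
    "subject-001": [
        {"operation": "insert", "start_image": "img_0001", "end_image": "img_0042"},
    ],
}
info = fetch_operation_unit_info(records, "subject-001")
missing = fetch_operation_unit_info(records, "subject-999")
```

A real system would transmit the ID over the communication unit and receive the matching records, but the keying principle is the same.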
- imaging is then started (S33).
- the image sensor in the imaging unit 12B acquires time-series image data at time intervals determined by the frame rate.
- image data inside the body cavity is acquired, and this image data is subjected to image processing by an image processing circuit in the imaging section 12B.
- the display unit 14B displays an image inside the body cavity using the image data that has been subjected to image processing. While viewing this image, the non-specialist operates the endoscope system 10B and moves the distal end toward the position of a target site such as an affected area.
- step S31 acquired the starting point image (start image 35ba) for each operation unit, so in this step the similar image determination section 23B compares the starting point image with the image acquired by the imaging section 12B and determines whether or not the starting point image is detected.
- the starting point images are sequentially read out from the input operation unit information, and each read-out starting point image is compared with the acquired image to determine whether they match or are similar. If, as a result of this determination, the starting point image is not detected, the process advances to step S53 to determine whether the end point has been reached.
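The sequential matching of recorded start images against the current frame can be sketched as follows. This is only an illustration: the pixel-difference similarity measure and the threshold are assumptions, whereas an actual similar image determination section would use feature matching or a trained model:

```python
def is_similar(image_a, image_b, threshold=0.9):
    """Crude illustrative similarity check: normalized agreement of
    grayscale pixel values in [0, 255]."""
    assert len(image_a) == len(image_b)
    mean_diff = sum(abs(a - b) for a, b in zip(image_a, image_b)) / len(image_a)
    return 1.0 - mean_diff / 255.0 >= threshold

def find_matching_unit(current_frame, operation_units):
    """Compare recorded start images with the current frame in order,
    returning the first operation unit whose start image matches."""
    for unit in operation_units:
        if is_similar(current_frame, unit["start_image"]):
            return unit
    return None

# Tiny fake grayscale "images" for demonstration.
frame = [10, 10, 10, 10]
units = [
    {"operation": "rotate_cw", "start_image": [200, 200, 200, 200]},
    {"operation": "insert", "start_image": [12, 11, 12, 11]},
]
match = find_matching_unit(frame, units)
```

If no start image matches, the caller would proceed to the end-point check, as in the flow above.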
- step S35 if the starting image is detected, the operation content information (insertion, rotation direction, amount, time) is then referred to in chronological order and reference information is displayed (S37).
- the similar image determination section 23B and the guide section 24B display the operation details (operation guide) in the operation unit corresponding to the starting point image detected in step S35 on the display section 14B.
- step S37 in order to display the operation guide, the similar image determination section 23B first determines the operation state of the second endoscope system 10B, such as a straight insertion operation, a rotation operation, or a bending operation, based on the image acquired by the imaging section 12B. That is, in this step, it can be said that the operation process when the subject undergoes an examination using the second endoscope system is estimated.
- the similar image determination section 23B may also determine the operation state based on information from sensors provided at the distal end of the second endoscope system 10B, etc., based on information related to the operation section 17B, or by combining these pieces of information.
- after determining the operation state, the guide unit 24B compares the operation state included in the operation unit information input from the auxiliary device 30 with the operation state determined by the similar image determination unit 23B, creates an operation guide based on the comparison result, and displays it on the display section 14B. That is, the operation information recorded in the operation unit information corresponding to the current operation unit and the current operation information of the second endoscope system 10B determined by the similar image determination unit 23B are displayed as reference information. Non-specialists can learn how to operate the second endoscope system 10B by referring to this reference information.
- the operation becomes almost the same as an examination by a specialist, and it can be said that the operation is aimed at the target area such as the affected area.
- in this step, the estimated operation process and the operation unit information are compared, and guide information for observing the characteristic parts of the organ to be observed under the same observation conditions as the first endoscope system is output.
- the control unit 11B compares the operation information of the specialist recorded as operation unit information with the state of the operation actually performed by the non-specialist, and determines whether the operation is correct or insufficient. Based on the determination result, a pass/fail display is shown on the display section 14B. For example, if a forward bending operation is recorded as the operation unit state but the determination in step S37 shows that a clockwise rotation operation has been performed, the operation states differ, so advice on correcting the operation is displayed.
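The pass/fail comparison just described reduces to matching the recorded operation against the detected one and emitting advice on a mismatch. The sketch below is illustrative only; the operation names and the advice wording are assumptions, not the patent's implementation:

```python
def check_operation(recorded, observed):
    """Compare the operation recorded in the operation unit information
    with the operation currently detected for the non-specialist.
    Returns (passed, advice_message)."""
    if observed == recorded:
        return True, "OK: operation matches the recorded procedure."
    return False, (
        f"Mismatch: the record shows '{recorded}' but '{observed}' was "
        f"detected; please correct the operation."
    )

# Example: a forward bend was recorded, but a clockwise rotation is detected.
ok_bad, advice_bad = check_operation("bend_forward", "rotate_cw")
ok_good, advice_good = check_operation("insert", "insert")
```

In the described system, the advice string would be rendered on the display section 14B as the pass/fail display.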
- events such as an air supply operation, a water injection operation, and a suction operation may be necessary, and based on the determination results, the operations that should be performed in response are displayed.
- other events include operations and processing other than the insertion direction, rotation (vertical orientation), and tip bending, such as the use of treatment instruments; changes in imaging parameters such as exposure and focus; image processing such as HDR (High Dynamic Range) and depth compositing; switching light sources, as in special light observation; image processing that emphasizes specific structures; and pigment scattering or staining. Objects can be discovered by adding such effort.
- a landmark image As mentioned above, objects that serve as landmarks when searching for the target are determined in the vicinity of the target, such as the affected area (see landmark Ob in FIG. 2), and a landmark image (which may also include position information) is recorded in the operation unit information 35b in the auxiliary device 30. Therefore, in this step, the similar image determination unit 23B compares the landmark image of the object serving as a landmark with the current image acquired by the imaging unit 12B, and based on this comparison determines whether or not a landmark image has been detected.
- step S41 if no landmark image is detected, it is determined whether the end point image of the operation unit has been reached (S49). As described above, the end image 35bb is recorded in the recording unit 35 in the auxiliary device 30 for each operation unit. In this step, the similar image determination section 23B compares the end point image (end image 35bb) with the current image acquired by the imaging section 12B, and determines whether or not the end point image has been detected based on this comparison.
- step S49 If the result of the determination in step S49 is that it is not the end point image of the operation unit, it is determined whether or not to start over (S51).
- a non-specialist operates the second endoscope system 10B aiming at a landmark or a target, but there are cases where they are unable to reach the landmark or target and must repeat the operation.
- the control unit 11B determines whether the non-specialist is performing a redo operation based on the image acquired by the imaging unit 12B, the operation unit 17B, the outputs of sensors provided in the device, and the like.
- the process returns to step S37 and repeats the above-described operation.
- the process returns to step S35 and the above-described operation is repeated.
- a discovery display is performed (S43).
- the control unit 11B or the guide unit 24B displays on the display unit 14B that a landmark for reaching the target region has been found. This display allows non-specialists to know that a target area such as an affected area exists near the landmark, so they carefully observe the area around the landmark and discover the target area.
- a mark is recorded (S45).
- an image of the landmark, etc. is recorded in the recording section 16B.
- pre-discovery operations are displayed (S47).
- Information about the operations performed by the specialist from finding the landmark to reaching the target site is recorded in the operation unit information 35b of the recording unit 35 (see S19 in FIG. 3). Therefore, in this step, the control section 11B or the guide section 24B performs operation display for guidance based on the recorded pre-discovery operation information.
- the similar image determination section 23B compares the image of the target region with the current image acquired by the imaging section 12B, and based on this comparison determines whether the target region has been detected. As a result of this determination, if the target region is not found, the process returns to step S47 and the pre-discovery operation display is performed. On the other hand, when the target part is found, the discovery of the target part is displayed, and the target part is recorded.
- step S53 it is determined whether or not the process is finished (S53).
- it is determined whether the predetermined examination performed by the non-specialist using the second endoscope system 10B has been completed. When a target such as an affected area is found and examination, imaging, etc. have been performed, the process may be determined to be finished. If there are multiple target parts, the process may be determined to be finished when the last target part is found. The process may also be determined to be over when the non-specialist decides to end the examination. If the result of this determination is that the process has not ended, the process returns to step S35 and the above-described operations are executed. On the other hand, if the result of the determination is that the process has ended, this flow is terminated.
- in this way, related data regarding the goal and object is acquired at the time of diagnosis/examination (see S31), and when the image acquired by the imaging unit 12B in the second endoscope system 10B is the same as or similar to the starting point image of an operation unit (S35 Yes), an operation guide is displayed on the display unit 14B based on the operation content information (see S37).
- further, when the landmark of the target region is detected (see S41), operations for finding the target region are displayed (S47). That is, since a guide is displayed for each operation unit based on the content of the specialist's operations, even a non-specialist can easily reach the goal.
- as described above, the operation content information recorded in chronological order in the first endoscope system is acquired as "operation unit information" that continues for a predetermined period of time (see S31), the operation process in the second endoscope system for the organ to be observed, which is an organ of the same subject as in the first endoscope system, is estimated, the estimated operation process and the "operation unit information" are compared, and guide information for observing characteristic parts of the organ to be observed under observation conditions similar to those of the first endoscope system is output (see S35 to S39). Therefore, even a non-specialist can observe the target area, such as an affected area, in the same way as a specialist.
- operation unit information is image change information indicating a succession of the same actions.
- the operation unit information can therefore be used as an operation guide when an examination is performed using the second endoscope system 10B.
- the asymmetry information of the organ to be observed is determined based on the anatomical positional relationship of multiple parts within the specific organ, rather than on the direction of gravity at the time of examination. Since the direction of gravity within an internal organ is unknown, it is difficult to determine positional relationships such as up, down, left, and right; however, the asymmetry of the organ to be observed can be determined from the positional relationships of multiple parts within a specific organ.
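As a minimal sketch of this idea, an in-image orientation can be derived from the positions of two anatomically known parts rather than from gravity. The part names and coordinates below are hypothetical placeholders for illustration:

```python
import math

def estimate_orientation(part_positions):
    """Estimate an in-image orientation angle (in degrees) from the
    anatomical positional relationship of two known parts within the
    organ, instead of relying on the direction of gravity.
    part_positions maps a part name to an (x, y) pixel coordinate."""
    ax, ay = part_positions["part_a"]
    bx, by = part_positions["part_b"]
    # Angle of the part_a -> part_b axis, measured from the +x direction.
    return math.degrees(math.atan2(by - ay, bx - ax))

# part_b lies directly "below" part_a in image coordinates.
angle = estimate_orientation({"part_a": (100, 100), "part_b": (100, 200)})
```

Once such an axis is fixed from the anatomy, relative directions (up/down/left/right within the organ) can be assigned consistently across frames.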
- an object having the character of a landmark
- the target region such as an affected region
- S41 an object that serves as a landmark
- the detection of the landmark may be omitted and only the detection and determination of a target region such as an affected region may be performed.
- the second endoscope system 10B may be implemented in cooperation with an external device such as the auxiliary device 30.
- the second endoscope system acquires an endoscopic image and transmits it to an external device (including a server, etc.) such as the auxiliary device 30; the external device may execute steps S35 to S51 and the like, and the second endoscope system 10B may perform display based on the processing results in the external device.
- in this case, a second endoscope system is configured including the external device and the second endoscope system 10B.
- one embodiment of the present invention provides a second endoscope system for observing the organ to be observed of a subject whose organ has been examined using the first endoscope system.
- This second endoscope system includes: an acquisition unit (for example, the communication unit 21B shown in FIG. 1B; see S31 in FIG. 4) that acquires time-series operation content information in the first endoscope system as operation unit information; an insertion operation determination unit (for example, the similar image determination unit 23B; see S37 in FIG. 4) that estimates the operation process when the subject undergoes an examination (including diagnosis and treatment) using the second endoscope system; and an operation guide section (for example, the guide section 24B in FIG. 1B; see S37 in FIG. 4) that compares the operation process estimated by the insertion operation determination unit with the operation unit information and outputs guide information for observing the characteristic parts of the organ to be observed under the same observation conditions as the first endoscope system.
- the above-mentioned operation unit information is image change information indicating a succession of the same motions estimated using the asymmetry of the observed organ.
- the first endoscope system includes: an input unit (for example, input unit 32A in FIG. 1A; S1 in FIG. 3) that inputs images of the organ acquired in time series; an operation unit determination unit (for example, operation unit determination unit 36 in FIG. 1A; S11 in FIG. 3) that divides the images of organs acquired in time series into operation units and determines the operation performed for each operation unit; a recording unit (for example, the recording unit 35 in FIG. 1A; S11 in FIG. 3) that records information regarding the image and endoscope operation in each operation unit as operation unit information for each operation unit determined by the operation unit determination unit; and an output section (for example, the communication section 34 in FIG. 1A; S23 in FIG. 3) that outputs the operation unit information recorded in the recording unit.
- the first endoscope system allows even a non-specialist to obtain information for observing a target region such as an affected area in the same manner as a specialist.
- an endoscope that is inserted from the oral cavity through the esophagus to examine the stomach or duodenum (including diagnosis and treatment) has been described as an example.
- the invention is not limited to gastric endoscopes and duodenal endoscopes; applicable endoscopes include, for example, laryngoscopes, bronchoscopes, cystoscopes, cholangioscopes, angioscopes, upper gastrointestinal endoscopes, duodenoscopes, and small intestine endoscopes.
- an example has been described in which an image obtained using an image sensor is used; however, the invention is not limited to this, and, for example, an image obtained using ultrasound may be used.
- ultrasound images may be used for the examination, diagnosis, and treatment of lesions that cannot be observed with optical images from an endoscope, such as lesions in the pancreas, pancreatic duct, gallbladder, bile duct, and liver.
- the operation unit was determined based on the image (see S7 and S11 in FIG. 3). However, in addition to the image, the determination may be made based on the output of a sensor provided in the endoscope, or based on operation information from the operation section of the endoscope. Also, in the second endoscope system 10B, the operation unit may be determined based on information other than images, as in the first endoscope system 10A.
- the invention is not limited to flexible endoscopes; even so-called rigid endoscopes can be inserted and rotated, and the present invention can also be used as a guide during these operations.
- for a rigid endoscope, the insertion angle is an additional factor when it is inserted into a body cavity, but the invention described in this application can be applied here as well if the operation guide is based on, for example, the case where the rigid scope is inserted approximately vertically. This is because it is possible to determine, from the image obtained at the time of insertion, whether the insertion angle has changed.
- logic-based determination and inference-based determination have been described; in this embodiment, either logic-based determination or inference-based determination may be selected and used as appropriate. Further, a hybrid determination may be performed that partially utilizes the merits of each.
- the control units 11A, 11B, and 31 have been described as devices composed of a CPU, memory, and the like. However, some or all of each part may be configured as hardware circuits, such as gate circuits generated based on a hardware description language such as Verilog or VHDL (VHSIC Hardware Description Language), or a hardware configuration using software such as a DSP (Digital Signal Processor) may be used. Of course, these may be combined as appropriate.
- the control units 11A, 11B, and 31 are not limited to CPUs and may be any elements that function as controllers, and the processing of each unit described above may be performed by one or more processors configured as hardware. For example, each unit may be a processor configured as an electronic circuit, or each unit may be a circuit unit in a processor configured with an integrated circuit such as an FPGA (Field Programmable Gate Array). Alternatively, a processor including one or more CPUs may execute the functions of each unit by reading and executing a computer program recorded on a recording medium.
- the auxiliary device 30 has been described as including a control section 31, an input section 32, an ID management section 33, a communication section 34, a recording section 35, an operation unit determination section 36, and an inference engine 37. However, these need not be provided in a single device; for example, the above-mentioned units may be distributed as long as they are connected via a communication network such as the Internet.
- similarly, the endoscope system 10A and the second endoscope system 10B have been described as including control units 11A and 11B, imaging units 12A and 12B, light source units 13A and 13B, display units 14A and 14B, ID management units 15A and 15B, recording sections 16A and 16B, operation sections 17A and 17B, an inference engine 18A, a clock section 20A, communication sections 21A and 21B, a signal output section 22B, a similar image determination section 23B, and a guide section 24B. However, these do not need to be provided in an integrated device, and each part may be distributed.
- of the techniques described in this specification, the control mainly explained in the flowcharts can often be set by a program, and such a program may be stored in a recording medium or a recording unit. The program may be recorded on the recording medium or recording unit at the time of product shipment, a distributed recording medium may be used, or the program may be downloaded via the Internet.
- the present invention is not limited to the above-described embodiment as it is, and can be embodied at the implementation stage by modifying the constituent elements within a range not departing from the gist of the invention.
- various inventions can be formed by appropriately combining the plurality of components disclosed in the above embodiments. For example, some components may be deleted from all the components shown in the embodiments. Furthermore, components of different embodiments may be combined as appropriate.
- 22B...Signal output section, 23B...Similar image determination section, 24B...Guide section, 30...Auxiliary device, 31...Control section, 32...Input section, 33...ID management section, 34...Communication section, 35...Recording section, 35a...Inspection image, 35b...Operation unit information, 35ba...Start image, 35bb...End image, 35bc...Operation information, 35be...Time information, 36...Operation unit determination section, 37...Inference engine
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biophysics (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Endoscopes (AREA)
Abstract
The present invention concerns a second endoscope system, a first endoscope system, and an endoscope inspection method that enable easy access to a target site such as an affected area. An endoscopic examination method uses a second endoscope system, on a subject whose organ has been examined using a first endoscope system, to observe an organ of the subject to be observed, the method comprising: acquiring time-series operation content information in the first endoscope system as operation unit information (S31); estimating an operation process for a case where the subject is examined using the second endoscope system; comparing the estimated operation process with the operation unit information; and outputting operation guide information for using the second endoscope system to observe a characteristic part of the organ to be observed with the second endoscope system (S37). The operation unit information is image change information estimated using the asymmetric property of the organ to be observed.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202280095728.XA CN119183359A (zh) | 2022-05-10 | 2022-05-10 | 第二内窥镜系统、第一内窥镜系统及内窥镜检查方法 |
| PCT/JP2022/019794 WO2023218523A1 (fr) | 2022-05-10 | 2022-05-10 | Second système endoscopique, premier système endoscopique et procédé d'inspection endoscopique |
| JP2024520115A JPWO2023218523A1 (fr) | 2022-05-10 | 2022-05-10 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2022/019794 WO2023218523A1 (fr) | 2022-05-10 | 2022-05-10 | Second système endoscopique, premier système endoscopique et procédé d'inspection endoscopique |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023218523A1 true WO2023218523A1 (fr) | 2023-11-16 |
Family
ID=88729976
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/019794 Ceased WO2023218523A1 (fr) | 2022-05-10 | 2022-05-10 | Second système endoscopique, premier système endoscopique et procédé d'inspection endoscopique |
Country Status (3)
| Country | Link |
|---|---|
| JP (1) | JPWO2023218523A1 (fr) |
| CN (1) | CN119183359A (fr) |
| WO (1) | WO2023218523A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2025134330A1 (fr) * | 2023-12-21 | 2025-06-26 | オリンパスメディカルシステムズ株式会社 | Dispositif de commande de mise au point d'endoscope, procédé de commande de mise au point et programme de commande de mise au point |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH08280604A (ja) * | 1994-08-30 | 1996-10-29 | Vingmed Sound As | 内視鏡検査または胃鏡検査用装置 |
| JP2005077831A (ja) * | 2003-09-01 | 2005-03-24 | Olympus Corp | 工業用内視鏡装置及びこれを用いた検査方法 |
| JP2008005923A (ja) * | 2006-06-27 | 2008-01-17 | Olympus Medical Systems Corp | 医用ガイドシステム |
| JP2014528794A (ja) * | 2011-09-30 | 2014-10-30 | ルフトハンザ・テッヒニク・アクチェンゲゼルシャフトLufthansa Technik Ag | ガスタービンを検査するための内視鏡検査システムおよび対応する方法 |
| JP2017059870A (ja) * | 2015-09-14 | 2017-03-23 | オリンパス株式会社 | 撮像操作ガイド装置および撮像装置の操作ガイド方法 |
| JP2019153874A (ja) * | 2018-03-01 | 2019-09-12 | オリンパス株式会社 | 情報記録装置、画像記録装置、操作補助装置、操作補助システム、情報記録方法、画像記録方法及び操作補助方法 |
-
2022
- 2022-05-10 JP JP2024520115A patent/JPWO2023218523A1/ja active Pending
- 2022-05-10 CN CN202280095728.XA patent/CN119183359A/zh active Pending
- 2022-05-10 WO PCT/JP2022/019794 patent/WO2023218523A1/fr not_active Ceased
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH08280604A (ja) * | 1994-08-30 | 1996-10-29 | Vingmed Sound As | 内視鏡検査または胃鏡検査用装置 |
| JP2005077831A (ja) * | 2003-09-01 | 2005-03-24 | Olympus Corp | 工業用内視鏡装置及びこれを用いた検査方法 |
| JP2008005923A (ja) * | 2006-06-27 | 2008-01-17 | Olympus Medical Systems Corp | 医用ガイドシステム |
| JP2014528794A (ja) * | 2011-09-30 | 2014-10-30 | ルフトハンザ・テッヒニク・アクチェンゲゼルシャフトLufthansa Technik Ag | ガスタービンを検査するための内視鏡検査システムおよび対応する方法 |
| JP2017059870A (ja) * | 2015-09-14 | 2017-03-23 | オリンパス株式会社 | 撮像操作ガイド装置および撮像装置の操作ガイド方法 |
| JP2019153874A (ja) * | 2018-03-01 | 2019-09-12 | オリンパス株式会社 | 情報記録装置、画像記録装置、操作補助装置、操作補助システム、情報記録方法、画像記録方法及び操作補助方法 |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2025134330A1 (fr) * | 2023-12-21 | 2025-06-26 | オリンパスメディカルシステムズ株式会社 | Dispositif de commande de mise au point d'endoscope, procédé de commande de mise au point et programme de commande de mise au point |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2023218523A1 (fr) | 2023-11-16 |
| CN119183359A (zh) | 2024-12-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6389136B2 (ja) | 内視鏡撮影部位特定装置、プログラム | |
| US20220361739A1 (en) | Image processing apparatus, image processing method, and endoscope apparatus | |
| JP2010279539A (ja) | 診断支援装置および方法並びにプログラム。 | |
| WO2023095208A1 (fr) | Dispositif de guidage d'insertion d'endoscope, procédé de guidage d'insertion d'endoscope, procédé d'acquisition d'informations d'endoscope, dispositif de serveur de guidage et procédé d'apprentissage de modèle d'inférence d'image | |
| JP7081862B1 (ja) | 手術支援システム、手術支援方法、及び手術支援プログラム | |
| KR20170055526A (ko) | 방광의 진단 맵핑을 위한 방법 및 시스템 | |
| WO2021075418A1 (fr) | Procédé de traitement d'image, procédé de génération de données d'apprentissage, procédé de génération de modèle entraîné, procédé de prédiction d'apparition de maladie, dispositif de traitement d'image, programme de traitement d'image et support d'enregistrement sur lequel un programme est enregistré | |
| JP2009022446A (ja) | 医療における統合表示のためのシステム及び方法 | |
| US12133635B2 (en) | Endoscope processor, training device, information processing method, training method and program | |
| WO2023218523A1 (fr) | Second système endoscopique, premier système endoscopique et procédé d'inspection endoscopique | |
| CN100563550C (zh) | 医用图像处理装置 | |
| EP4434435A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement | |
| EP4497369A1 (fr) | Procédé et système d'analyse et de manipulation d'imagerie endoscopique médicale | |
| JP7441452B2 (ja) | 教師データ生成方法、学習済みモデル生成方法、および発病予測方法 | |
| JP7561382B2 (ja) | 大腸内視鏡観察支援装置、作動方法、及びプログラム | |
| WO2023282144A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations, système d'endoscope et dispositif d'aide à la préparation de rapport | |
| WO2023013080A1 (fr) | Procédé d'aide à l'annotation, programme d'aide à l'annotation, et dispositif d'aide à l'annotation | |
| JP7533905B2 (ja) | 大腸内視鏡観察支援装置、作動方法、及びプログラム | |
| WO2025141691A1 (fr) | Dispositif de détection d'objet, procédé de détection d'objet, programme de détection d'objet et système d'endoscope | |
| JP7264407B2 (ja) | 訓練用の大腸内視鏡観察支援装置、作動方法、及びプログラム | |
| JP7600247B2 (ja) | 学習装置、学習方法、プログラム、学習済みモデル、及び内視鏡システム | |
| US12333727B2 (en) | Endoscope apparatus, information processing method, and storage medium | |
| WO2024176780A1 (fr) | Dispositif d'assistance médicale, endoscope, procédé d'assistance médicale, et programme | |
| WO2024095673A1 (fr) | Dispositif d'assistance médicale, endoscope, méthode d'assistance médicale et programme | |
| WO2024048098A1 (fr) | Dispositif d'assistance médicale, endoscope, méthode d'assistance médicale et programme |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22941605 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2024520115 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 22941605 Country of ref document: EP Kind code of ref document: A1 |