CN119632476B - Endoscopic guidance methods, devices, storage media and equipment - Google Patents
Abstract
The invention relates to an endoscope advancement guidance method, device, storage medium, and equipment. The method comprises: acquiring the advancement position of an endoscope during advancement and the endoscope images it collects; correcting the accumulated error of the advancement position according to a centerline network corresponding to a region of interest, to obtain a corrected advancement position; inputting the endoscope images into a pre-trained bifurcation prediction neural network model to obtain its bifurcation prediction result; predicting, according to the centerline network and the bifurcation prediction result, the remaining advancement path and remaining advancement time from the corrected advancement position to a pre-selected advancement endpoint, together with the target advancement branch at a bifurcation; marking the target advancement branch and the non-target branches of the bifurcation; and performing advancement guidance according to the remaining advancement path, the remaining advancement time, and the branch-marked bifurcation. The endoscope is thus guided in real time, preventing it from being advanced into a wrong branch and improving the accuracy of endoscope advancement.
Description
Technical Field
The present disclosure relates to the field of image processing technology, and in particular, to an endoscope advancement guiding method, apparatus, storage medium, and device.
Background
Human tissue is not rigid: at different moments it may stretch, contract, or shift, which can obstruct the endoscope as it is advanced. It is therefore necessary not only to guide the endoscope during advancement but also to determine its position in the body in real time, so that it is not advanced into the wrong tissue or organ.
Disclosure of Invention
In view of the above technical problems, an object of the present disclosure is to provide an endoscope advancement guide method, apparatus, storage medium, and device.
To achieve the above object, a first aspect of embodiments of the present disclosure provides an endoscope advancement guide method, including:
acquiring an advancement position of the image portion of the endoscope during advancement and an endoscope image acquired by the image portion;
correcting the accumulated error of the advancement position according to a centerline network corresponding to the region of interest, to obtain a corrected advancement position;
inputting the endoscope image into a pre-trained bifurcation prediction neural network model, to obtain a bifurcation prediction result output by the model;
predicting, according to the centerline network and the bifurcation prediction result, a remaining advancement path and a remaining advancement time from the corrected advancement position to a pre-selected advancement endpoint, and a target advancement branch at a bifurcation;
marking the target advancement branch and the non-target branches of the bifurcation, and performing advancement guidance according to the remaining advancement path, the remaining advancement time, and the branch-marked bifurcation.
Optionally, correcting the accumulated error of the advancement position according to the centerline network corresponding to the region of interest, to obtain a corrected advancement position, includes:
determining a target key node corresponding to the advancement position from the key nodes of the centerline network corresponding to the region of interest; and
correcting the accumulated error of the advancement position according to the node position corresponding to the target key node, to obtain the corrected advancement position.
Optionally, predicting, according to the centerline network and the bifurcation prediction result, the remaining advancement path, the remaining advancement time, and the target advancement branch at a bifurcation from the corrected advancement position to the pre-selected advancement endpoint, includes:
predicting the remaining advancement path from the corrected advancement position to the pre-selected advancement endpoint along the centerline network, in the case where the bifurcation prediction result indicates that a bifurcation exists ahead of the endoscope;
predicting the remaining advancement time from the corrected advancement position to the pre-selected advancement endpoint according to the remaining advancement path and the average speed of the endoscope during the current advancement; and
determining the target advancement branch from the corrected advancement position to the pre-selected advancement endpoint from among the branches of the bifurcation indicated by the bifurcation prediction result.
Optionally, the centerline network corresponding to the region of interest is established by:
constructing a three-dimensional model of the region of interest according to an original medical image of the region of interest to be examined;
performing image segmentation on the three-dimensional model to obtain a pipeline network corresponding to the pipeline system of each pipeline type; and
extracting the centerline of the pipeline network of each pipeline type to obtain the corresponding centerline network, where the line width of a centerline in the centerline network is one volume element, and a unique passage exists between any two volume elements on the centerline.
Optionally, performing image segmentation on the three-dimensional model to obtain the pipeline network corresponding to the pipeline system of each pipeline type includes:
delineating a target region including the region of interest in the three-dimensional model;
scanning the volume elements of the three-dimensional model within the target region according to an input exposure value range, to obtain the volume elements corresponding to the target region;
searching, starting from an input seed point and according to the exposure value range, for volume elements in the three-dimensional model that are connected to one another, and adding display labels to the connected volume elements; and
hiding the volume elements without display labels in the three-dimensional model, to obtain the pipeline network corresponding to the pipeline system of each pipeline type.
Optionally, marking the target advancement branch and the non-target branches of the bifurcation includes:
marking the target advancement branch of the bifurcation with a preset first color and a first preset marking pattern; and
marking the non-target branches of the bifurcation with a preset second color and a second preset marking pattern.
In a second aspect of embodiments of the present disclosure, there is provided an endoscope advancing guide device including:
an acquisition module configured to acquire an advancement position of the image portion of the endoscope during advancement and an endoscope image acquired by the image portion;
a correction module configured to correct the accumulated error of the advancement position according to the centerline network corresponding to the region of interest, to obtain a corrected advancement position;
an input module configured to input the endoscope image into a pre-trained bifurcation prediction neural network model, to obtain a bifurcation prediction result output by the model;
a prediction module configured to predict, according to the centerline network and the bifurcation prediction result, a remaining advancement path and a remaining advancement time from the corrected advancement position to a pre-selected advancement endpoint, and a target advancement branch at a bifurcation; and
a display module configured to mark the target advancement branch and the non-target branches of the bifurcation, and to perform advancement guidance according to the remaining advancement path, the remaining advancement time, and the branch-marked bifurcation.
Optionally, the correction module is configured to:
determine a target key node corresponding to the advancement position from the key nodes of the centerline network corresponding to the region of interest; and
correct the accumulated error of the advancement position according to the node position corresponding to the target key node, to obtain the corrected advancement position.
Optionally, the prediction module is configured to:
predict the remaining advancement path from the corrected advancement position to the pre-selected advancement endpoint along the centerline network, in the case where the bifurcation prediction result indicates that a bifurcation exists ahead of the endoscope;
predict the remaining advancement time from the corrected advancement position to the pre-selected advancement endpoint according to the remaining advancement path and the average speed of the endoscope during the current advancement; and
determine the target advancement branch from the corrected advancement position to the pre-selected advancement endpoint from among the branches of the bifurcation indicated by the bifurcation prediction result.
Optionally, the correction module is further configured to establish a centerline network corresponding to the region of interest by:
constructing a three-dimensional model of the region of interest according to an original medical image of the region of interest to be examined;
performing image segmentation on the three-dimensional model to obtain a pipeline network corresponding to the pipeline system of each pipeline type; and
extracting the centerline of the pipeline network of each pipeline type to obtain the corresponding centerline network, where the line width of a centerline in the centerline network is one volume element, and a unique passage exists between any two volume elements on the centerline.
Optionally, the correction module is further configured to:
delineate a target region including the region of interest in the three-dimensional model;
scan the volume elements of the three-dimensional model within the target region according to an input exposure value range, to obtain the volume elements corresponding to the target region;
search, starting from an input seed point and according to the exposure value range, for volume elements in the three-dimensional model that are connected to one another, and add display labels to the connected volume elements; and
hide the volume elements without display labels in the three-dimensional model, to obtain the pipeline network corresponding to the pipeline system of each pipeline type.
Optionally, the display module is configured to:
mark the target advancement branch of the bifurcation with a preset first color and a first preset marking pattern; and
mark the non-target branches of the bifurcation with a preset second color and a second preset marking pattern.
A third aspect of embodiments of the present disclosure provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a first processor, implements the steps of the endoscope advancement guide method of any one of the first aspects.
In a fourth aspect of embodiments of the present disclosure, there is provided an electronic device, including:
A first memory having a computer program stored thereon;
A second processor for executing the computer program in the first memory to implement the steps of the endoscope advancement guidance method of any one of the first aspects.
Through the technical scheme, at least the following beneficial effects can be achieved:
Through the above technical solution, the advancement position of the image portion of the endoscope during advancement and the endoscope image acquired by the image portion are obtained, and the accumulated error of the advancement position is corrected according to the centerline network corresponding to the region of interest, yielding a corrected advancement position. This prevents the error of the advancement position from accumulating to the point where the exact position of the endoscope can no longer be determined. The endoscope image is then input into a pre-trained bifurcation prediction neural network model to obtain its bifurcation prediction result; the remaining advancement path and remaining advancement time from the corrected advancement position to the pre-selected advancement endpoint, together with the target advancement branch at a bifurcation, are predicted according to the centerline network and the bifurcation prediction result; the target advancement branch and the non-target branches of the bifurcation are marked; and advancement guidance is performed according to the remaining advancement path, the remaining advancement time, and the branch-marked bifurcation. The endoscope can thus be guided in real time, preventing it from being advanced into a wrong branch and improving the accuracy with which it reaches the pre-selected advancement endpoint.
Additional features and advantages of the present disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification, illustrate the disclosure and together with the description serve to explain, but do not limit the disclosure. In the drawings:
Fig. 1 is a flowchart illustrating an endoscope advancement guide method according to an embodiment of the present disclosure.
Fig. 2 is a flow chart illustrating a method of implementing step S12 of fig. 1, according to an embodiment of the present disclosure.
Fig. 3 is a flow chart illustrating a method of implementing step S14 of fig. 1, according to an embodiment of the present disclosure.
Fig. 4 is a flow chart illustrating a method of establishing a centerline network for a region of interest according to an embodiment of the present disclosure.
Fig. 5 is a flowchart illustrating a method of implementing step S402 of fig. 4, according to an embodiment of the present disclosure.
Fig. 6 is a block diagram of an endoscopic push guiding device according to an embodiment of the present disclosure.
Fig. 7 is a block diagram of an electronic device, shown in accordance with an embodiment of the present disclosure.
Detailed Description
Specific embodiments of the present disclosure are described in detail below with reference to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating and illustrating the disclosure, are not intended to limit the disclosure.
Before describing the endoscope advancement guidance method, apparatus, storage medium, and device provided by the present disclosure, its application scenario is first described: during advancement, the endoscope may encounter a bifurcation at which the advancement direction cannot be determined.
To this end, embodiments of the present disclosure provide an endoscope advancement guidance method that guides the advancement of the endoscope in real time and, when a bifurcation is encountered, marks it and gives guidance, so that the endoscope is not advanced into a wrong branch and the accuracy with which it reaches a pre-selected advancement endpoint is improved. As shown in fig. 1, the method includes the following steps.
In step S11, a pushing position of the image portion of the endoscope during pushing and an endoscope image acquired by the image portion are acquired.
In the embodiment of the disclosure, the image portion of the endoscope may be advanced along a pre-planned advancement path; during advancement, images of the tissue and ducts passed are acquired in real time through the image portion of the endoscope to obtain endoscope images, and the advancement position of the image portion during advancement is acquired by an inertial sensor disposed in the image portion.
The method provided by the disclosure can further determine, from the advancement position of the image portion, whether the image portion of the endoscope is advancing along the pre-planned advancement path, and can issue a prompt if it deviates from that path during advancement.
In step S12, the accumulated error of the propulsion position is corrected according to the centerline network corresponding to the region of interest, so as to obtain a corrected propulsion position.
It should be noted that, because tissue is not rigid, it may stretch, contract, or shift in different states, so that the advancement position of the endoscope deviates from the stored historical data. In some extreme cases the stored historical data may also differ from the actual surgery or examination. The actually acquired advancement position must therefore be analyzed: if the error clearly falls outside a reasonable range, a serious abnormality risk exists, and in addition to correcting the advancement position, an alarm prompt may be given.
In the embodiment of the disclosure, a target position of the endoscope may be determined from the centerline network corresponding to the region of interest, and the advancement position of the endoscope may then be corrected according to this target position. For example, when the Euclidean distance between the target position and the advancement position of the endoscope exceeds a preset distance threshold, the advancement position is replaced with the target position in the centerline network, i.e., the target position in the centerline network is taken as the corrected advancement position. When the Euclidean distance does not exceed the preset distance threshold, the advancement position of the endoscope is used directly as the corrected advancement position. This prevents error from continually accumulating in the advancement position during advancement and the gap between the reported position and the actual position from growing ever larger.
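The threshold-based snap described above can be sketched as follows. This is a hedged illustration: the function name, the array representation of the centerline, and the default threshold are assumptions, not taken from the patent.

```python
import numpy as np

def correct_position(advanced_pos, centerline_pts, threshold=5.0):
    """Snap a tracked endoscope position onto the centerline when drift
    exceeds a threshold (illustrative sketch; the default threshold of
    5.0 units is an assumption).

    advanced_pos  : (3,) position reported by the inertial sensor
    centerline_pts: (N, 3) voxel positions along the centerline
    threshold     : maximum tolerated Euclidean drift before snapping
    """
    pos = np.asarray(advanced_pos, dtype=float)
    pts = np.asarray(centerline_pts, dtype=float)
    # The nearest centerline point plays the role of the target position.
    dists = np.linalg.norm(pts - pos, axis=1)
    i = int(np.argmin(dists))
    if dists[i] > threshold:
        return pts[i]   # drift too large: replace with the centerline point
    return pos          # within tolerance: keep the sensor position
```

Using the nearest centerline point as the target position is one plausible choice; the patent describes the Euclidean-distance test but not how the target position itself is selected.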
In step S13, the endoscopic image is input into a pre-trained bifurcation prediction neural network model, and a bifurcation prediction result output by the bifurcation prediction neural network model is obtained.
In an embodiment of the present disclosure, the bifurcation prediction neural network model may be, for example, a YOLO model. The original YOLO model can be trained on endoscope images annotated with bifurcations, yielding a set of model parameters. During surgery or examination, endoscope images acquired in real time by the image portion of the endoscope are input into the trained YOLO model, which outputs a prediction of whether a bifurcation exists in the image. In addition, when a bifurcation exists, the prediction result can also mark the direction and size of each branch of the bifurcation. The YOLO model can perform bifurcation prediction quickly and accurately.
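For illustration, turning a detector's per-branch output into the direction and size cues mentioned above might look like the sketch below. The bounding-box format and function name are assumptions (YOLO-style detectors typically emit such boxes together with class and confidence); the patent does not specify the output encoding.

```python
import math

def summarize_branches(boxes, img_w, img_h):
    """Derive per-branch direction and size cues from detection boxes.

    boxes: list of (x1, y1, x2, y2) pixel boxes, one per detected branch
           opening (format is an assumption, not from the patent).
    Returns one dict per branch: the angle of the branch centre relative
    to the image centre (degrees; 0 = right, 90 = down in image
    coordinates) and the branch's area relative to the image.
    """
    cx0, cy0 = img_w / 2.0, img_h / 2.0
    out = []
    for (x1, y1, x2, y2) in boxes:
        bx, by = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        angle = math.degrees(math.atan2(by - cy0, bx - cx0))
        area = (x2 - x1) * (y2 - y1) / float(img_w * img_h)
        out.append({"angle_deg": angle, "rel_area": area})
    return out
```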
In step S14, the remaining advancement path and remaining advancement time from the corrected advancement position to the pre-selected advancement endpoint, and the target advancement branch at a bifurcation, are predicted based on the centerline network and the bifurcation prediction result.
In the embodiment of the disclosure, the remaining advancement path to the pre-selected advancement endpoint may be calculated along the centerline network from the corrected advancement position; prediction error may arise from tissue deformation such as stretching, contraction, or displacement. The remaining advancement time from the corrected advancement position to the pre-selected advancement endpoint can then be predicted from the remaining advancement distance and the preset advancement direction and advancement speed.
In the embodiment of the disclosure, if the bifurcation prediction result indicates that there is no bifurcation ahead, the single target advancement branch may be displayed directly; if it indicates that a bifurcation exists ahead, the target advancement branch from the corrected advancement position to the pre-selected advancement endpoint is predicted. It is easy to understand that there may be several candidate target advancement branches; in that case the candidates can be ranked, the optimal one displayed as the target advancement branch, and all other branches treated as non-target branches.
In step S15, the target advancement branch and the non-target branches of the bifurcation are marked, and advancement guidance is performed according to the remaining advancement distance, the remaining advancement time, and the branch-marked bifurcation.
In the embodiment of the disclosure, the remaining advancement distance and remaining advancement time may be announced by voice and also displayed on a user interface. Similarly, for the branch-marked bifurcation, voice guidance can be given for the marked target advancement branch and non-target branches, with the target advancement branch indicated on the user interface by its marking color or symbol, and the non-target branches indicated by a different, non-target color or symbol.
In the embodiment of the disclosure, based on the displayed endoscope image, a doctor may temporarily change the advancement direction and speed of the endoscope, for example when a new lesion is found, and may temporarily change the advancement path. In that case, based on the current advancement position of the image portion of the endoscope, several optimal path options from that position to the pre-selected advancement endpoint are planned in real time under preset constraints and displayed.
In the above technical solution, the advancement position of the image portion of the endoscope during advancement and the endoscope image acquired by the image portion are obtained, and the accumulated error of the advancement position is corrected according to the centerline network corresponding to the region of interest to obtain a corrected advancement position, preventing the error from accumulating until the exact position of the endoscope can no longer be determined. The endoscope image is input into a pre-trained bifurcation prediction neural network model to obtain its bifurcation prediction result; the remaining advancement distance and remaining advancement time from the corrected advancement position to the pre-selected advancement endpoint and the target advancement branch at a bifurcation are predicted according to the centerline network and the bifurcation prediction result; the target advancement branch and non-target branches of the bifurcation are marked; and advancement guidance is performed according to the remaining advancement distance, the remaining advancement time, and the branch-marked bifurcation. The endoscope can thus be guided in real time, preventing advancement into a wrong branch and improving the accuracy with which the endoscope reaches the pre-selected advancement endpoint.
In the embodiment of the present disclosure, referring to fig. 2, in step S12, correcting the accumulated error of the advancement position according to the centerline network corresponding to the region of interest to obtain a corrected advancement position includes:
In step S121, a target key node corresponding to the propulsion position is determined from the key nodes of the centerline network corresponding to the region of interest.
In the embodiment of the disclosure, the key nodes of the centerline network corresponding to the region of interest may be determined according to the type of the target tissue, its outline, its characteristic dimensions, its normal or abnormal indices, bifurcations, and the like. Key nodes have comparatively distinctive features and can be recognized clearly in the endoscope image.
Further, the target key node may be the key node most recently passed at the advancement position, or the key node visible in the current endoscope image.
In step S122, the accumulated error of the propulsion position is corrected according to the node position corresponding to the target key node, so as to obtain a corrected propulsion position.
In the embodiment of the disclosure, the distance between the node position corresponding to the target key node and the advancement position can be calculated, and when this distance exceeds a preset threshold, the advancement position is replaced with the node position to obtain the corrected advancement position. In this way, the node position serves as the starting point of each subsequent advancement segment, so that errors in earlier advancement positions do not propagate into later ones.
In the embodiment of the present disclosure, referring to fig. 3, in step S14, the predicting, according to the central line network and the bifurcation predicting result, a remaining propulsion path from the corrected propulsion position to a pre-selected propulsion destination, a remaining propulsion time, and a target propulsion branch at a bifurcation, includes:
In step S141, when the bifurcation prediction result indicates that a bifurcation exists ahead of the endoscope, the remaining advancement path from the corrected advancement position to the pre-selected advancement endpoint is predicted along the centerline network.
In embodiments of the present disclosure, when the bifurcation prediction result indicates that a bifurcation exists ahead of the endoscope, multiple remaining advancement paths from the corrected advancement position to the pre-selected advancement endpoint are predicted along the centerline network, each corresponding to one branch of the bifurcation. This facilitates selecting an appropriate target advancement branch among the different remaining paths. For example, when diseased tissue must be bypassed, the cost of the remaining advancement path through each branch can be compared, so that an excessive detour does not make the surgery or examination unreasonable.
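As a sketch of the per-branch path costing above, a shortest-path search over the key nodes of the centerline network could be used. The patent does not name a specific algorithm; the Dijkstra search and the graph representation below are assumptions.

```python
import heapq

def remaining_path(graph, start, goal):
    """Shortest remaining path from the corrected position to the
    advancement endpoint along a centerline graph (illustrative sketch).

    graph: {node: [(neighbor, edge_length_mm), ...]} built from the
           centerline network's key nodes (representation assumed).
    Returns (total_length, node_list), or (inf, []) if unreachable.
    """
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    seen = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in seen:
            continue
        seen.add(u)
        if u == goal:
            # Reconstruct the node sequence back to the start.
            path = [u]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return d, path[::-1]
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    return float("inf"), []
```

Running this once per bifurcation branch (using each branch's first node as `start`) yields the per-branch costs that the text suggests comparing.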
In step S142, the remaining advancement time from the corrected advancement position to the pre-selected advancement endpoint is predicted based on the remaining advancement path and the average speed of the endoscope during the current advancement.
Because tissues differ between individuals, for example in the caliber of bodily ducts, computing the remaining advancement time from a preset advancement direction and speed may introduce a large error. The remaining advancement time from the corrected advancement position to the pre-selected advancement endpoint is therefore predicted from the average speed of the endoscope during the current advancement, improving the accuracy of the estimate.
It should be noted that the average speed of the endoscope during the current advancement is calculated from the accumulated time and accumulated path while the endoscope is actually advancing; if the endoscope stops at any position to observe, the observation time is not counted in the accumulated time.
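A minimal sketch of this pause-aware average-speed estimate is given below. The sampling format and the speed threshold used to detect observation stops are assumptions, not from the patent.

```python
def remaining_time_s(remaining_mm, segments, pause_speed_mm_s=0.5):
    """Estimate remaining advancement time from the average speed of the
    current procedure, ignoring intervals where the endoscope was held
    still for observation (sketch; threshold value is an assumption).

    remaining_mm: predicted remaining path length in millimetres
    segments: list of (distance_mm, duration_s) tracking samples
    """
    moving = [(d, t) for d, t in segments
              if t > 0 and d / t > pause_speed_mm_s]  # drop observation stops
    total_d = sum(d for d, _ in moving)
    total_t = sum(t for _, t in moving)
    if total_d == 0:
        return None          # no motion recorded yet; cannot estimate
    avg = total_d / total_t  # average speed over moving intervals only
    return remaining_mm / avg
```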
In step S143, the target advancement branch from the corrected advancement position to the pre-selected advancement endpoint is determined from among the branches of the bifurcation indicated by the bifurcation prediction result.
There may be one or more such target advancement branches.
In the embodiment of the present disclosure, referring to fig. 4, a centerline network corresponding to a region of interest is established by:
in step S401, a three-dimensional stereoscopic model for the region of interest is constructed from the original medical image of the region of interest to be examined.
In the embodiment of the disclosure, a three-dimensional model of the region of interest can be constructed from original medical images captured of the region of interest to be examined from different angles; other prior-art techniques for constructing three-dimensional models may also be used and are not described in detail here.
In step S402, image segmentation is performed on the three-dimensional model to obtain a pipeline network corresponding to the pipeline system of each pipeline type.
In the embodiment of the disclosure, either a traditional image segmentation algorithm based on morphological operations or a deep-learning-based segmentation network, such as a UNet, can be used to segment the three-dimensional model and obtain the pipeline network corresponding to each pipeline type in the human body's pipeline system.
In step S403, a central line of the pipeline network corresponding to each pipeline type is extracted, so as to obtain a central line network corresponding to the pipeline network of each pipeline type.
The pipelines can include ducts that exist in the body's natural physiology, sinus tracts formed by pathological factors, surgical paths formed by dissecting tissue to reach an operative area, and the like.
The line width of the central line in the central line network is one volume element, and a unique passage exists between any two volume elements on the central line.
In the embodiment of the disclosure, the centerline of the obtained pipeline network can be extracted by a centerline extraction algorithm; the resulting centerline network is equivalent to the pipeline network in topology, localization, and the like, thereby simplifying the network structure. The centerline network is characterized in that (1) the line width of a centerline is one volume element, and (2) a unique, non-repeating passage exists between any two volume elements on the centerline network.
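The unique-passage property above means that, viewed as a graph of volume elements, the centerline network is a tree: connected and acyclic. A validation sketch follows; the graph representation is an assumption, not part of the patent.

```python
def is_unique_path_network(nodes, edges):
    """Check the centerline property that any two volume elements are
    joined by exactly one passage: the undirected graph must be a tree
    (illustrative sketch for validating an extracted centerline).

    nodes: iterable of hashable node ids (e.g. voxel coordinates)
    edges: iterable of (u, v) undirected edges between nodes
    """
    nodes = list(nodes)
    edge_list = list(edges)
    if not nodes:
        return True
    # A tree on n nodes has exactly n - 1 edges and is connected.
    if len(edge_list) != len(nodes) - 1:
        return False
    adj = {n: [] for n in nodes}
    for u, v in edge_list:
        adj[u].append(v)
        adj[v].append(u)
    seen = {nodes[0]}
    stack = [nodes[0]]
    while stack:  # depth-first traversal to test connectivity
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen) == len(nodes)
```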
According to the technical scheme, the constructed three-dimensional model can be subjected to image segmentation, the pipeline network is extracted, and then the central line network corresponding to the pipeline network is determined, so that the influence among different types of pipelines in the image is reduced.
In the embodiment of the present disclosure, referring to fig. 5, in step S402, the image segmentation is performed on the three-dimensional stereo model to obtain a pipeline network corresponding to a pipeline system of each pipeline type, including:
in step S4021, a target region including a region of interest is delineated in the three-dimensional stereoscopic model.
In the embodiment of the disclosure, a three-dimensional area is defined as a target area from the space range where the three-dimensional stereo model is located, and the target area includes a focus area of a patient, wherein the shape of the target area may be a cube, an oblique cube, a cylinder, an oblique cylinder, or other irregular three-dimensional body.
Further, in the three-dimensional model, a hidden label is added to other volume elements located outside the target area, and when the three-dimensional model is presented, the volume elements marked with the hidden label are hidden, i.e. the volume elements marked with the hidden label are not displayed.
In step S4022, according to the input exposure value range, the three-dimensional model in the target area is scanned for volume elements, so as to obtain volume elements corresponding to the target area.
It can be understood that the exposure value range includes a lower threshold and an upper threshold, a volume element of the three-dimensional model in the target area is scanned, a hidden label is added to a volume element of the target area, the exposure value of which falls outside the exposure value range, and then the volume element marked with the hidden label in the target area is hidden.
In step S4023, taking the input seed point as a starting point, volume elements having a connectivity relation in the three-dimensional model are searched for according to the exposure value range, and a display label is added to the volume elements having the connectivity relation.
In the embodiment of the disclosure, the seed points are input by the user in real time according to the operation requirement. In implementation, a display label can be added to volume elements having a connectivity relation, and a hidden label can be added to volume elements without a connectivity relation.
In step S4024, volume elements in the three-dimensional model to which no display label has been added are hidden, so as to obtain the pipeline network corresponding to the pipeline system of each pipeline type.
In the embodiment of the disclosure, the volume elements marked with the hidden label can be hidden, and the volume elements marked with the display label are displayed, so as to obtain the pipeline network corresponding to the pipeline system of each pipeline type in the human body pipeline system.
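For illustration only, steps S4022–S4024 can be sketched as seed-based region growing over the exposure value range; the boolean mask plays the role of the display label, while the array shape, the 6-neighbourhood connectivity, and the value ranges are assumptions, not specifics from the patent:

```python
import numpy as np
from collections import deque

def grow_pipeline(volume, seed, low, high):
    """Region growing: starting from the seed voxel, collect all
    6-connected voxels whose exposure value falls inside [low, high].
    Returns a boolean mask playing the role of the 'display label';
    voxels left False correspond to hidden-label voxels."""
    mask = np.zeros(volume.shape, dtype=bool)
    if not (low <= volume[seed] <= high):
        return mask  # the seed itself is outside the exposure range
    mask[seed] = True
    queue = deque([seed])
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in offsets:
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[i] < volume.shape[i] for i in range(3)) \
                    and not mask[n] and low <= volume[n] <= high:
                mask[n] = True
                queue.append(n)
    return mask
```

Note that a voxel inside the exposure range but not connected to the seed stays hidden, which is exactly the behaviour described for step S4023.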
Optionally, in step S15, marking the target propulsion branch and the non-target branch of the guiding bifurcation includes:
marking the target propulsion branch of the guiding bifurcation with a preset first color and a first preset marking pattern; and
marking the non-target propulsion branch of the guiding bifurcation with a preset second color and a second preset marking pattern.
In this embodiment of the present disclosure, the first preset marking pattern may be an arrow guide added to the branch, and advancement may further be prompted by voice according to the arrow in the preset first color. The second preset marking pattern may be the same as the first preset marking pattern, in which case the displayed target and non-target propulsion branches are distinguished only by color; or it may differ from the first preset marking pattern, in which case the branches are distinguished by both color and marking pattern.
Further, when a plurality of target propulsion branches exist, they can be marked with different colors, wherein the preset first color is more conspicuous than the preset second color.
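For illustration, the marking scheme above can be sketched as a simple mapping from branches to (color, pattern) marks; the concrete color and pattern names below are assumptions, not values from the disclosure:

```python
# Illustrative preset colors; the disclosure only requires that the
# first color be more conspicuous than the second.
FIRST_COLOR, SECOND_COLOR = "green", "gray"

def mark_branches(branches, target_branches):
    """Assign each branch of a bifurcation a (color, pattern) mark:
    target propulsion branches get the conspicuous preset first color
    and an arrow pattern; non-target branches get the preset second
    color and a plain pattern."""
    marks = {}
    for branch in branches:
        if branch in target_branches:
            marks[branch] = (FIRST_COLOR, "arrow")
        else:
            marks[branch] = (SECOND_COLOR, "plain")
    return marks
```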
The embodiments of the present disclosure also provide an endoscope propulsion guiding device. Referring to fig. 6, the endoscope propulsion guiding device 600 includes:
an acquisition module 610 configured to acquire a pushing position of an image portion of the endoscope during pushing and an endoscope image acquired by the image portion;
The correction module 620 is configured to correct the accumulated error of the propulsion position according to the centerline network corresponding to the region of interest, so as to obtain a corrected propulsion position;
An input module 630 configured to input the endoscope image into a pre-trained bifurcation prediction neural network model, to obtain a bifurcation prediction result output by the bifurcation prediction neural network model;
a prediction module 640 configured to predict, according to the centerline network and the bifurcation prediction result, a remaining propulsion path from the corrected propulsion position to a preselected propulsion destination, a remaining propulsion time, and a target propulsion branch at a bifurcation;
The display module 650 is configured to mark the target propulsion branch and the non-target branch of the bifurcation, and to perform propulsion guidance according to the remaining propulsion path, the remaining propulsion time, and the branch-marked bifurcation.
Optionally, the correction module 620 is configured to:
Determining a target key node corresponding to the propulsion position from key nodes of the central line network corresponding to the region of interest;
and correcting the accumulated error of the propulsion position according to the node position corresponding to the target key node to obtain the corrected propulsion position.
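For illustration, the correction step above can be sketched as snapping the reported propulsion position to the nearest key node of the central line network; the `tolerance` parameter and the key-node coordinates are assumptions for the sketch, not specifics from the disclosure:

```python
import math

def correct_position(position, key_nodes, tolerance=5.0):
    """Accumulated-error correction sketch: find the key node of the
    central line network closest to the reported propulsion position
    and, if it lies within `tolerance` (an assumed threshold, e.g. in
    mm), snap the position to that node; otherwise keep the position."""
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    target = min(key_nodes, key=lambda node: dist(position, node))
    return target if dist(position, target) <= tolerance else position
```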
Optionally, the prediction module 640 is configured to:
Predicting a remaining propulsion path along the centerline network from the corrected propulsion position to a preselected propulsion destination if the bifurcation prediction result characterizes the presence of a bifurcation ahead of the endoscope;
Predicting the remaining propulsion time from the corrected propulsion position to the preselected propulsion destination according to the remaining propulsion distance and the average speed of the endoscope in the current propulsion process;
Determining, from the branches of the bifurcation characterized by the bifurcation prediction result, a target propulsion branch from the corrected propulsion position to the preselected propulsion destination.
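For illustration, the path and time predictions above can be sketched over a graph representation of the centerline network; because a centerline network has a unique passage between any two elements (it is a tree), a breadth-first search already yields the unique remaining path, from which remaining distance and time follow. The node names and the `lengths` table are illustrative assumptions:

```python
from collections import deque

def remaining_guidance(adjacency, lengths, current, destination, avg_speed):
    """Sketch of predicting the remaining propulsion path and time.
    `adjacency` maps each centerline node to its neighbours; `lengths`
    maps (node, node) pairs to segment lengths (both directions).
    Returns (path, remaining_distance, remaining_time)."""
    # Breadth-first search along the centerline network; on a tree the
    # first path found to the destination is the only one.
    prev = {current: None}
    queue = deque([current])
    while queue:
        node = queue.popleft()
        if node == destination:
            break
        for n in adjacency[node]:
            if n not in prev:
                prev[n] = node
                queue.append(n)
    # Reconstruct the path and accumulate segment lengths.
    path, node = [], destination
    while node is not None:
        path.append(node)
        node = prev[node]
    path.reverse()
    distance = sum(lengths[(a, b)] for a, b in zip(path, path[1:]))
    # Remaining time = remaining distance / average propulsion speed.
    return path, distance, distance / avg_speed
```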
Optionally, the correction module 620 is further configured to establish a centerline network corresponding to the region of interest by:
constructing a three-dimensional model aiming at the region of interest according to an original medical image of the region of interest to be checked;
image segmentation is carried out on the three-dimensional model to obtain a pipeline network corresponding to a pipeline system of each pipeline type;
Extracting a central line of a pipeline network corresponding to each pipeline type to obtain a central line network corresponding to the pipeline network of each pipeline type, wherein the line width of the central line in the central line network is a volume element, and a unique passage exists between any two volume elements on the central line.
Optionally, the correction module 620 is further configured to:
defining a target region including the region of interest in the three-dimensional model;
According to the input exposure value range, scanning volume elements of the three-dimensional model in the target area to obtain volume elements corresponding to the target area;
searching, with the input seed point as a starting point and according to the exposure value range, for volume elements having a connectivity relation in the three-dimensional model, and adding a display label to the volume elements having the connectivity relation;
hiding the volume elements without display labels in the three-dimensional model to obtain a pipeline network corresponding to the pipeline system of each pipeline type.
Optionally, the presentation module 650 is configured to:
Marking the target propulsion branch of the guiding bifurcation by presetting a first color and a first preset marking pattern;
and marking the non-target pushing branch of the guide bifurcation by presetting a second color and a second preset marking pattern.
The detailed manner in which the respective modules of the endoscope propulsion guiding device in the above embodiment perform operations has been described in detail in the embodiments of the method, and will not be explained again here.
It will be appreciated by those skilled in the art that the above-described embodiments of the apparatus are merely illustrative, and that the division of modules is merely a logical function division, and that other divisions may be implemented, for example, multiple modules may be combined or integrated into a single module. Further, the modules illustrated as separate components may or may not be physically separate. Also, each module may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. When implemented in hardware, may be implemented in whole or in part in the form of an integrated circuit or chip.
The presently disclosed embodiments also provide a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a first processor, implements the steps of the endoscope advancement guide method of any of the preceding embodiments.
The embodiment of the disclosure also provides an electronic device, including:
A first memory having a computer program stored thereon;
A second processor for executing the computer program in the first memory to implement the steps of the endoscope advancement guidance method of any of the preceding embodiments.
Fig. 7 is a block diagram of an electronic device 700, according to an example embodiment. The electronic device 700 may be configured as an endoscope advancement guide, as shown in fig. 7, and the electronic device 700 may include a third processor 701, a third memory 702. The electronic device 700 may also include one or more of a multimedia component 703, an input/output (I/O) interface 704, and a communication component 705.
The third processor 701 is configured to control the overall operation of the electronic device 700 to perform all or part of the steps of the endoscope advancement guidance method described above. The third memory 702 is used to store various types of data to support operation on the electronic device 700, such as instructions for any application or method operating on the electronic device 700, as well as application-related data such as pictures. The third memory 702 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. The multimedia component 703 can include a screen and an audio component. The screen may be, for example, a touch screen; the audio component is used to output and/or input audio signals. For example, the audio component may include a microphone for receiving external audio signals, and the received audio signal may be further stored in the third memory 702 or transmitted through the communication component 705. The audio component further comprises at least one speaker for outputting audio signals. The I/O interface 704 provides an interface between the third processor 701 and other interface modules, such as a keyboard, a mouse, or buttons. These buttons may be virtual buttons or physical buttons for operating the endoscope advancement guide. The communication component 705 is used for wired or wireless communication between the electronic device 700 and other devices.
The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or a combination of one or more of them, which is not limited herein. Accordingly, the communication component 705 may comprise a Wi-Fi module, a Bluetooth module, an NFC module, etc.
In an exemplary embodiment, the electronic device 700 may be implemented by one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above-described endoscope advancement guidance method.
In another exemplary embodiment, a computer readable storage medium is also provided that includes program instructions that, when executed by a processor, implement the steps of the endoscope advancement guide method described above. For example, the computer readable storage medium may be the third memory 702 including program instructions described above, which are executable by the third processor 701 of the electronic device 700 to perform the endoscope advancement guidance method described above.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the present disclosure is not limited to the specific details of the embodiments described above, and various simple modifications may be made to the technical solutions of the present disclosure within the scope of the technical concept of the present disclosure, and all the simple modifications belong to the protection scope of the present disclosure.
In addition, the specific features described in the above embodiments may be combined in any suitable manner without contradiction. The various possible combinations are not described further in this disclosure in order to avoid unnecessary repetition.
Moreover, any combination of the various embodiments of the present disclosure may be made, as long as it does not depart from the spirit of the present disclosure, and such combinations should likewise be regarded as content disclosed herein.
Claims (8)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311201443.2A CN119632476B (en) | 2023-09-15 | 2023-09-15 | Endoscopic guidance methods, devices, storage media and equipment |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN119632476A CN119632476A (en) | 2025-03-18 |
| CN119632476B true CN119632476B (en) | 2025-11-25 |
Family
ID=94946752
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009056239A (en) * | 2007-09-03 | 2009-03-19 | Olympus Medical Systems Corp | Endoscope device |
| WO2023124981A1 (en) * | 2021-12-31 | 2023-07-06 | 杭州堃博生物科技有限公司 | Motion navigation method, apparatus, and device for bronchoscope |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3850217B2 (en) * | 2000-12-27 | 2006-11-29 | オリンパス株式会社 | Endoscope position detector for bronchi |
| WO2005058137A2 (en) * | 2003-12-12 | 2005-06-30 | University Of Washington | Catheterscope 3d guidance and interface system |
| CN113855242B (en) * | 2021-12-03 | 2022-04-19 | 杭州堃博生物科技有限公司 | Bronchoscope position determination method, device, system, equipment and medium |
| CN116416197A (en) * | 2021-12-31 | 2023-07-11 | 杭州堃博生物科技有限公司 | Method and device for identifying bifurcation of lung segment, electronic device and computer readable storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant |