EP4588006A1 - Determination of a contour physical distance within a subject based on a deformable three-dimensional model - Google Patents
- Publication number
- EP4588006A1 (EP23782671.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- scene
- point
- deformable
- model
- dynamic measurement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/579—Depth or shape recovery from multiple images from motion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/62—Analysis of geometric attributes of area, perimeter, diameter or volume
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/08—Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20101—Interactive definition of point of interest, landmark or seed
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
Definitions
- An illustrative system includes a memory storing instructions and one or more processors communicatively coupled to the memory.
- the one or more processors may be configured to execute the instructions to perform a process comprising: generating, based on imagery of a scene, a deformable three-dimensional (3D) model of the scene; identifying a first point on an anatomical object located in the scene; and determining, based on the deformable 3D model, a dynamic measurement value representative of a contour physical distance between the first point and a second point in the scene.
- the dynamic measurement value may dynamically update with movement of one or more objects within the scene.
- An illustrative method includes generating, by at least one computing device and based on imagery of a scene, a deformable 3D model of the scene; identifying, by the at least one computing device, a first point on an anatomical object located in the scene; and determining, by the at least one computing device and based on the deformable 3D model, a dynamic measurement value representative of a contour physical distance between the first point and a second point in the scene, wherein the dynamic measurement value dynamically updates with movement of one or more objects within the scene.
- An illustrative non-transitory computer-readable medium may store instructions that, when executed, direct a processor of a computing device to perform a process comprising: generating, based on imagery of a scene, a deformable 3D model of the scene; identifying a first point on an anatomical object located in the scene; and determining, based on the deformable 3D model, a dynamic measurement value representative of a contour physical distance between the first point and a second point in the scene, wherein the dynamic measurement value dynamically updates with movement of one or more objects within the scene.
- FIG. 1 shows an illustrative implementation including a dynamic measurement system.
- FIG. 2 shows another illustrative implementation including a dynamic measurement system.
- FIG. 3 shows an illustrative method of operating a dynamic measurement system.
- FIG. 4 shows another illustrative method of operating a dynamic measurement system.
- FIGS. 5A and 5B show illustrative implementations of generating a deformable 3D model using a dynamic measurement system.
- FIG. 7 shows an illustrative implementation of a display that may be generated using a dynamic measurement system.
- FIG. 9 shows an illustrative computing system according to principles described herein.
- An illustrative dynamic measurement system may be configured to determine a dynamic measurement of a contour physical distance between points within a scene based on a deformable 3D model of the scene.
- the dynamic measurement system may be configured to generate, based on imagery (e.g., as captured by an imaging device) of a scene (e.g., an area within a subject of a medical procedure), a deformable 3D model of the scene.
- the dynamic measurement system may further be configured to identify a first point on an anatomical object located in the scene and determine, based on the deformable 3D model, a dynamic measurement value representative of a contour physical distance between the first point and a second point in the scene.
- the deformable 3D model may be generated in real-time during the medical procedure based on the imagery of the scene. This may allow the deformable 3D model to depict movement of one or more objects within the scene as the one or more objects deform (e.g., due to breathing and/or force applied by an instrument during a medical procedure), which may cause the contour physical distance to change. Accordingly, the dynamic measurement value may dynamically update with movement of the one or more objects within the scene based on the deformable 3D model.
- the determination of a dynamic measurement value based on a deformable 3D model may allow the dynamic measurement value to be determined more accurately and/or efficiently.
- the determination of the dynamic measurement value based on a deformable 3D model may account for surface contours of anatomical objects, which may decrease occlusion issues caused by the surface contours, and/or account for anatomical objects located outside of a field of view of an imaging device, which may increase an area for the determination of the dynamic measurement value.
- the determination of the dynamic measurement value based on a deformable 3D model may allow the dynamic measurement value to be dynamically updated, such as while one or more anatomical objects within the scene are deformed.
- Memory 104 may store and/or otherwise maintain executable data used by processor 106 to perform any of the functionality described herein.
- memory 104 may store instructions 108 that may be executed by processor 106.
- Memory 104 may be implemented by one or more memory or storage devices, including any memory or storage devices described herein, that are configured to store data in a transitory or non-transitory manner.
- Instructions 108 may be executed by processor 106 to cause dynamic measurement system 102 to perform any of the functionality described herein.
- Instructions 108 may be implemented by any suitable application, software, code, and/or other executable data instance.
- memory 104 may also maintain any other data accessed, managed, used, and/or transmitted by processor 106 in a particular implementation.
- the imagery may include image data (e.g., color, grayscale, saturation, intensity, brightness, depth, etc.) captured by imaging device 204.
- the image data may, in some instances, be associated with data points expressed in a common coordinate frame such as 3D voxels or two-dimensional (2D) pixels of images captured by imaging device 204.
- imaging device 204 may be moved relative to scene 208 to capture imagery of scene 208 at different viewpoints.
- Scene 208 may include an environment (e.g., an area within a subject of a medical procedure) and/or one or more objects within an environment.
- scene 208 may include an anatomical object 210.
- Anatomical object 210 may include an object associated with a subject (e.g., a body of a live animal, a human or animal cadaver, a portion of human or animal anatomy, tissue removed from human or animal anatomies, non-tissue work pieces, training models, etc.).
- anatomical object 210 may include tissue of a subject (e.g., an organ, soft tissue, connective tissue, etc.).
- Non-anatomical objects may be included within scene 208, such as physical tools (e.g., scalpels, scissors, forceps, clamps, etc.) and/or other objects (e.g., staples, mesh, sponges, etc.) used for a medical procedure.
- Dynamic measurement system 202 may implement or be similar to dynamic measurement system 102 and may be configured to receive imagery of scene 208 from imaging device 204.
- dynamic measurement system 202 may be configured to fuse imagery of scene 208 captured by imaging device 204 at different viewpoints of scene 208.
- the fusing may include merging aligned (or overlapping) voxels or pixels, such as by blending intensity and/or depth values for aligned voxels or pixels.
- the blending may include weighted blending in which the data points being blended are weighted based on one or more factors, such as which camera of a stereoscopic device has the best view of a data point (e.g., by more heavily weighting data captured by the camera with the best viewing angle).
- the fusing may additionally or alternatively include stitching non-overlapping voxels or pixels together, such as by stitching images together along non-overlapping boundaries of the images. Accordingly, the fusing of imagery at different viewpoints may allow the imagery of scene 208 to include an area that is larger than a single field of view of imaging device 204.
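The weighted blending of aligned voxels or pixels described above can be sketched as follows. This is an illustrative sketch only; the function name, array shapes, and weighting scheme are assumptions, not taken from the patent:

```python
import numpy as np

def fuse_aligned(values, weights):
    """Blend k aligned voxel/pixel observations into one fused value.

    values:  (k, ...) array of aligned observations (e.g. intensity or depth)
    weights: (k,) per-observation weights, e.g. weighting the camera of a
             stereoscopic device with the better viewing angle more heavily
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the weights sum to 1
    vals = np.asarray(values, dtype=float)
    # Weighted average along the observation axis.
    return np.tensordot(w, vals, axes=(0, 0))
```

With equal weights this reduces to a plain average of the aligned observations; unequal weights let the better-placed camera dominate the fused value.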
- dynamic measurement system 202 includes a deformable 3D model generator 212 configured to generate a deformable 3D model 214 based on imagery of scene 208.
- deformable 3D model generator 212 may be configured to generate a point cloud having a plurality of nodes representative of surface points on one or more objects (e.g., anatomical object 210) within scene 208 as depicted in imagery captured by imaging device 204.
- Deformable 3D model generator 212 may further be configured to generate vertices associated with 3D locations that correspond to 3D locations of the plurality of nodes from the imagery.
- deformable 3D model generator 212 may be configured to determine a depth associated with the plurality of nodes, such as by processing stereoscopic images captured by imaging device 204.
- a depth map of scene 208 may be generated using a depth sensor.
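For stereoscopic images, the depth of a surface point can be recovered from the standard pinhole stereo relation Z = f·b/d. A minimal sketch under assumed units (focal length in pixels, baseline in meters, disparity in pixels); the function name is illustrative:

```python
def depth_from_disparity(disparity_px: float, focal_px: float,
                         baseline_m: float) -> float:
    """Depth in meters from stereo disparity via Z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```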
- Deformable 3D model generator 212 may further be configured to deform deformable 3D model 214 over time with the movement of one or more objects within scene 208.
- anatomical object 210 within scene 208 may be deformed during a medical procedure. Such deformation may be caused by the expansion and/or contraction of anatomical object 210 (e.g., while a subject of a medical procedure is breathing), by movement of another object (e.g., one physically connected to anatomical object 210) in scene 208, and/or by a force applied to anatomical object 210 (e.g., by a physical tool, a human finger, etc.).
- Other non-anatomical objects may move within scene 208 in addition to or instead of anatomical object 210 (e.g., a physical tool may move relative to anatomical object 210).
- the 3D locations of the vertices of deformable 3D model 214 may track the 3D locations of the plurality of nodes associated with the vertices as the 3D locations of the plurality of nodes update in the imagery captured by imaging device 204 with the movement of the one or more objects within scene 208. This may allow deformable 3D model 214 to deform over time with the movement of the one or more objects within scene 208.
- deformable 3D model generator 212 may be configured to detect deformation of deformable 3D model 214, such as by comparing the 3D locations of the vertices of deformable 3D model 214 at two or more different points of time.
- Additionally or alternatively, a first 3D model (which may be deformable or nondeformable) and a second 3D model (which may be deformable or nondeformable) may be compared with each other to detect deformation.
- a simultaneous localization and mapping (SLAM) heuristic may be used by deformable 3D model generator 212 to construct and/or update a map of scene 208 while simultaneously keeping track of the location of objects within scene 208.
- the SLAM heuristic may be configured to generate the point cloud having the plurality of nodes representative of surface points on one or more objects within scene 208 and derive and/or associate vertices of deformable 3D model 214 with 3D locations that correspond to 3D locations of the plurality of nodes as imaging device 204 views scene 208 in real-time.
- deformable 3D model generator 212 may be configured to generate deformable 3D model 214 based on preoperative imagery of scene 208.
- the movement of one or more objects within scene 208 may be determined based on kinematic data representative of movement of the one or more objects over time.
- the kinematic data may be generated by or associated with a computer-assisted medical system communicatively coupled with the one or more objects (e.g., a physical tool).
- Dynamic measurement system 202 further includes a contour physical distance module 216 configured to determine, based on deformable 3D model 214, a dynamic measurement value representative of a contour physical distance along scene 208 and output the dynamic measurement value to user interface 206.
- the contour physical distance may be representative of a distance between two or more points within scene 208 that may extend over a physical surface of one or more objects within scene 208.
- the dynamic measurement value may be represented by any suitable value, such as a discrete value (e.g., a distance, a range, a percentage, etc.) representative of the contour physical distance.
- contour physical distance module 216 may be configured to determine a 3D contour that may extend along a surface of one or more objects of the deformable 3D model 214 and connect the two or more points within scene 208 such that the 3D contour may be representative of the contour physical distance between the two or more points.
- the two or more points may be associated with vertices of deformable 3D model 214.
- contour physical distance module 216 may be configured to identify one or more additional vertices of deformable 3D model 214 between the two or more points on the 3D contour.
- contour physical distance module 216 may determine intermediate distances for each segment of a linear-segmented route that passes through the 3D locations of each adjacent vertex of deformable 3D model 214. Based on the intermediate distances, contour physical distance module 216 may compute the dynamic measurement value as a sum of the intermediate distances. The sum of the intermediate distances may provide an estimation for an exact contour physical distance, which may become more accurate as more vertices and/or intermediate distances are defined. Additionally or alternatively, contour physical distance module 216 may determine a direct point-to-point distance between the 3D locations of each point of the two or more points.
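The summation of intermediate distances along the linear-segmented route can be sketched as follows; this is an illustrative helper, not the patent's implementation:

```python
import math

def contour_distance(vertices):
    """Approximate the contour physical distance as the sum of Euclidean
    lengths of the segments joining the 3D locations of adjacent vertices.

    vertices: ordered list of (x, y, z) tuples along the 3D contour, from
              the first point through the intermediate vertices to the
              second point.
    """
    # The estimate approaches the exact contour distance as more
    # intermediate vertices are defined along the surface.
    return sum(math.dist(a, b) for a, b in zip(vertices, vertices[1:]))
```

A direct point-to-point distance is the degenerate case of the same computation with no intermediate vertices.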
- User interface 206 may be configured to receive the dynamic measurement value from dynamic measurement system 202.
- User interface 206 of the illustrated implementation includes a display device 218.
- Display device 218 may be implemented by a monitor or other suitable device configured to display information to a user.
- display device 218 may be configured to display the dynamic measurement value received from dynamic measurement system 202.
- display device 218 may further be configured to display imagery of scene 208 captured by imaging device 204 and/or deformable 3D model 214 generated by dynamic measurement system 202.
- user interface 206 may include any suitable device (e.g., a button, joystick, touchscreen, keyboard, handle, etc.) configured to receive a user input such as to identify points within scene 208 for determining the dynamic measurement value.
- dynamic measurement system 202 may be configured to determine multiple dynamic measurements within scene 208.
- dynamic measurement system 202 may be configured to determine a distance between a physical tool and multiple anatomical objects 210 within scene 208.
- dynamic measurement system 202 may be configured to mark, track, and/or present the multiple dynamic measurements.
- dynamic measurement system 202 may be configured to mark (e.g., highlight) the physical tool, the multiple anatomical objects 210, and/or distances between the physical tool and multiple anatomical objects 210 (e.g., on a display of display device 218).
- Dynamic measurement system 202 may further be configured to track and update the multiple dynamic measurements as the physical tool is moved relative to the multiple anatomical objects 210. Dynamic measurement system 202 may further be configured to present (e.g., label) the multiple dynamic measurements to a user (e.g., on a display of display device 218).
- dynamic measurement system 202 may, at operation 302, generate, based on imagery of scene 208, deformable 3D model 214 of scene 208.
- dynamic measurement system 202 may generate a point cloud having a plurality of nodes representative of surface points on one or more objects (e.g., anatomical object 210) within scene 208 and derive vertices associated with 3D locations that correspond to 3D locations of the plurality of nodes (e.g., using a SLAM heuristic).
- dynamic measurement system 202 may further identify a second point in scene 208.
- the second point may be spaced a distance away from the first point in scene 208.
- the second point may be identified on the same anatomical object 210 as the first point, on a different anatomical object 210 from the first point, on a non-anatomical object (e.g., a physical tool) within scene 208, and/or another area within scene 208.
- the second point may correspond to another feature on anatomical object 210 (e.g., an opposing edge of a hernia).
- the first and second points are spaced sufficiently apart that one may be located outside of a field of view of the imaging device.
- the identifying the second point may include implementing and applying artificial intelligence algorithms, such as machine learning algorithms, to designate the second point in scene 208.
- Any suitable form of artificial intelligence and/or machine learning may be used, including, for example, deep learning, neural networks, etc.
- a machine learning algorithm may be generated through machine learning procedures and applied to identification operations.
- the machine learning algorithm may be directed to identifying an object and/or a feature of an object within scene 208.
- the machine learning algorithm may operate as an identification function that is applied to individual and/or fused imagery to classify an object in the imagery.
- Still other suitable methods may be used for identifying the second point in scene 208 in addition to or instead of machine learning algorithms.
- Dynamic measurement system 202 may be configured to dynamically update the dynamic measurement value with movement of one or more objects within scene 208.
- the 3D locations of the vertices of deformable 3D model 214 associated with at least one of the first point or the second point may move as anatomical object 210 is deformed.
- the 3D locations of the intermediate vertices on the 3D contour extending between the first and second points may update with movement of one or more objects within scene 208.
- These changes in the 3D locations of the vertices may affect the intermediate distances between the vertices. Accordingly, the intermediate distances may be recomputed as the 3D locations of the vertices are updated to dynamically update the dynamic measurement value.
- dynamic measurement system 202 may detect changes of the 3D locations of the vertices on the 3D contour of deformable 3D model 214 and compare the distance between the 3D locations of the vertices at two different points of time. Additionally or alternatively, the change in the distance between the 3D locations of the vertices may be determined by comparing the 3D locations of the vertices in a first 3D model, which may be deformable or nondeformable, generated at a first point of time and with 3D locations of corresponding vertices in a second 3D model, which may be deformable or nondeformable, generated at a second point of time. Still other suitable methods for determining the dynamic measurement value may be used.
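Detecting which contour vertices have moved between two points of time can be sketched as a simple position comparison; the function name and tolerance are assumptions for illustration:

```python
import math

def vertices_moved(prev_locations, curr_locations, tol=1e-6):
    """Return indices of vertices whose 3D location changed by more than
    tol between two points of time; a non-empty result triggers
    recomputation of the intermediate distances."""
    return [i for i, (a, b) in enumerate(zip(prev_locations, curr_locations))
            if math.dist(a, b) > tol]
```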
- the dynamic measurement value may be dynamically updated as the hernia deforms (e.g., due to breathing and/or insufflation) such that the size of the mesh patch may be selected or adjusted based on the dynamic updates of the dynamic measurement value.
- dynamic measurement system 202 may determine a length of a bowel (e.g., during a lower anterior resection procedure) based on a dynamic measurement value that may be dynamically updated as the length of the bowel is stretched and/or compressed.
- the dynamic measurement value may further provide a reference for a size of a lung (e.g., during a thoracic surgery).
- the dynamic measurement value may be dynamically updated as the lung is deformed.
- dynamic measurement system 202 may be configured to determine, based on the dynamic measurement value, how far away a tip of a physical tool is from anatomical object 210 during a medical procedure.
- dynamic measurement system 202 may be configured to track changes (e.g., dynamic updates) of the dynamic measurement value (e.g., due to movement of the one or more objects within scene 208). For example, dynamic measurement system 202 may determine a change of the dynamic measurement value, such as by determining a difference between the dynamic measurement value and a previous dynamic measurement value. The change of the dynamic measurement value may indicate an effectiveness and/or progress of a surgical step (e.g., insufflation).
- FIG. 4 shows another illustrative method 400 that may be performed by dynamic measurement system 202. While FIG. 4 illustrates exemplary operations according to one embodiment, other embodiments may omit, add to, reorder, and/or modify any of the operations shown in FIG. 4. Moreover, each of the operations depicted in FIG. 4 may be performed in any of the ways described herein.
- dynamic measurement system 202 may, at operation 402, generate, based on imagery of scene 208 (e.g., as captured by imaging device 204), a point cloud having a plurality of nodes representative of surface points on one or more objects (e.g., anatomical object 210) within scene 208. Dynamic measurement system 202 may further, at operation 404, generate deformable 3D model 214 having vertices with 3D locations associated with the 3D locations of the plurality of nodes.
- Dynamic measurement system 202 may further, at operation 406, identify two or more points within scene 208. In some implementations, these points may be identified by receiving a user input (e.g., via user interface 206) designating the two or more points on the imagery of scene 208 and/or deformable 3D model 214. In some implementations, at least one of the two or more points may be located on anatomical object 210 within scene 208. The remaining point(s) may be positioned on the same anatomical object 210, a different anatomical object 210, a physical tool, and/or another area within scene 208.
- Dynamic measurement system 202 may further, at operation 408, identify vertices of deformable 3D model 214 that form a 3D contour connecting the two or more points. For example, dynamic measurement system 202 may identify vertices of deformable 3D model 214 having 3D locations that correspond to the identified two or more points within scene 208. Dynamic measurement system 202 may further identify one or more additional vertices of deformable 3D model 214 having 3D locations positioned between the vertices corresponding to the identified two or more points. The 3D contour may be formed to connect the identified vertices such that the 3D contour may extend along a surface of deformable 3D model 214 between the identified two or more points.
- Dynamic measurement system 202 may further, at operation 410, determine, based on the 3D contour, a dynamic measurement value representative of a contour physical distance between the two or more points.
- dynamic measurement system 202 may be configured to sum the distances between the 3D locations of adjacent vertices on the 3D contour of deformable 3D model 214 to determine the dynamic measurement value.
- Dynamic measurement system 202 may update the 3D locations of the vertices of deformable 3D model 214 with updated 3D locations of the corresponding nodes and determine whether the 3D locations of any of the identified vertices on the 3D contour of deformable 3D model 214 have moved.
- dynamic measurement system 202 may, at operation 414, update the dynamic measurement value. For example, dynamic measurement system 202 may recompute the sum of the distances between the updated 3D locations of the vertices on the 3D contour. If one or more vertices have not moved (no, at operation 412), dynamic measurement system 202, may continue to monitor for movement of the one or more vertices on the 3D contour. In some implementations, the operation 412 of determining whether movement of one or more vertices on the 3D contour of deformable 3D model 214 has occurred may be omitted. For example, the dynamic measurement value may be updated and/or recomputed at select intervals without determining whether movement of one or more vertices on the 3D contour of deformable 3D model 214 has occurred.
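Operations 412 and 414 above (check for vertex movement, then recompute the sum of segment lengths) can be sketched as follows; this is a simplified illustration under assumed names, not the patented implementation:

```python
import math

def update_measurement(prev_contour, curr_contour, tol=1e-9):
    """If any vertex on the 3D contour moved beyond tol (operation 412),
    recompute the dynamic measurement value as the sum of segment lengths
    (operation 414); otherwise return None and keep monitoring."""
    moved = any(math.dist(a, b) > tol
                for a, b in zip(prev_contour, curr_contour))
    if not moved:
        return None  # no movement detected; measurement unchanged
    return sum(math.dist(a, b)
               for a, b in zip(curr_contour, curr_contour[1:]))
```

As the text notes, the movement check may be omitted and the sum simply recomputed at select intervals.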
- FIGS. 5A-6B show an illustrative example of determining a dynamic measurement value that may be performed by dynamic measurement system 202.
- FIG. 5A shows an implementation 500 of imagery 502 of a scene (e.g., scene 208) that may be captured by imaging device 204.
- imagery 502 includes a plurality of anatomical objects 504 (e.g., anatomical objects 504-1 to 504-3) and a physical tool 506 spaced away from the plurality of anatomical objects 504 in an environment 508 (e.g., an area within a subject of a medical procedure).
- FIG. 6A shows an implementation 600 of two or more points 602 (e.g., points 602-1 to 602-2) identified on deformable 3D model 514.
- the two or more points 602 may be identified on imagery 502 and/or deformable 3D model 514, such as by receiving a user input (e.g., via user interface 206) designating the two or more points 602 on imagery 502 and/or deformable 3D model 514.
- a first point 602-1 has been designated on a first anatomical object 504-1 and a second point 602-2 has been designated on a third anatomical object 504-3 that is spaced away from first anatomical object 504-1 by a second anatomical object 504-2.
- the identified points 602 may be associated with vertices 516 of deformable 3D model 514 having 3D locations that correspond to the identified points 602.
- Dynamic measurement system 202 may identify one or more intermediate vertices 604 (e.g., vertices 604-1 to 604-n) of deformable 3D model 514 between the identified points 602.
- intermediate vertices 604 may include vertices 516 of deformable 3D model 514 having 3D locations positioned along a surface of deformable 3D model 514 between the identified points 602.
- intermediate vertices 604 extend along a surface of first anatomical object 504-1, second anatomical object 504-2, and third anatomical object 504-3 from first point 602-1 to second point 602-2.
- Dynamic measurement system 202 may further form a 3D contour 606 connecting the identified points 602 through intermediate vertices 604.
- 3D contour 606 may be formed by connecting first point 602-1 and second point 602-2 through intermediate vertices 604 such that 3D contour 606 extends along the surfaces of first anatomical object 504-1, second anatomical object 504-2, and third anatomical object 504-3 from first point 602-1 to second point 602-2.
- Dynamic measurement system 202 may thereby update the dynamic measurement value, such as by recomputing the sum of the distances between the updated 3D locations of first point 602-1, intermediate vertices 604, and second point 602-2 on 3D contour 606.
- FIG. 7 shows an illustrative implementation 700 of a display 702 that may be displayed on display device 218.
- display 702 includes a first display view 704 and a second display view 706.
- First display view 704 may display imagery 708 of scene 208 as captured by imaging device 204.
- imagery 708 depicts an anatomical object 710.
- Second display view 706 may display a deformable 3D model 712 of anatomical object 710 that may be generated by dynamic measurement system 202 based on imagery 708.
- Deformable 3D model 712 may implement or be similar to deformable 3D model 214 and/or deformable 3D model 514.
- first display view 704 and second display view 706 may be uncoupled such that deformable 3D model 712 in second display view 706 may move (e.g., pan, rotate, zoom, etc.) independently from imagery 708 in first display view 704.
- a user may provide a user input to move deformable 3D model 712 within second display view 706 without moving imaging device 204 and/or changing imagery 708 displayed in first display view 704. This may allow the user to view a larger area and/or a different viewpoint of anatomical object 710 in deformable 3D model 712 without moving imaging device 204 and/or changing imagery 708 displayed in first display view 704.
- The medical team may include a first user 810-1 (such as a surgeon for a surgical procedure), a second user 810-2 (such as a patient-side assistant), a third user 810-3 (such as another assistant, a nurse, a trainee, etc.), and a fourth user 810-4 (such as an anesthesiologist for a surgical procedure), all of whom may be collectively referred to as users 810, and each of whom may control, interact with, or otherwise be a user of computer-assisted medical system 800. More, fewer, or alternative users may be present during a medical procedure as may serve a particular implementation. For example, team composition for different medical procedures, or for non-medical procedures, may differ and include users with different roles.
- While FIG. 8 illustrates an ongoing minimally invasive medical procedure such as a minimally invasive surgical procedure, computer-assisted medical system 800 may similarly be used to perform open medical procedures or other types of operations. For example, operations such as exploratory imaging operations, mock medical procedures used for training purposes, and/or other operations may also be performed.
- Manipulator assembly 802 may include one or more manipulator arms 812 (e.g., manipulator arms 812-1 through 812-4) to which one or more instruments may be coupled.
- The instruments may be used for a computer-assisted medical procedure on patient 808 (e.g., in a surgical example, by being at least partially inserted into patient 808 and manipulated within patient 808).
- While manipulator assembly 802 is depicted and described herein as including four manipulator arms 812, it will be recognized that manipulator assembly 802 may include a single manipulator arm 812 or any other number of manipulator arms as may serve a particular implementation.
- While the example of FIG. 8 depicts manipulator arms 812 as being robotic manipulator arms, one or more instruments may be partially or entirely manually controlled, such as by being handheld and controlled manually by a person.
- These partially or entirely manually controlled instruments may be used in conjunction with, or as an alternative to, computer-assisted instrumentation that is coupled to manipulator arms 812 shown in FIG. 8.
- User control apparatus 804 may be configured to facilitate teleoperational control by user 810-1 of manipulator arms 812 and instruments attached to manipulator arms 812. To this end, user control apparatus 804 may provide user 810-1 with imagery of an operational area associated with patient 808 as captured by an imaging device. To facilitate control of instruments, user control apparatus 804 may include a set of master controls. These master controls may be manipulated by user 810-1 to control movement of the manipulator arms 812 or any instruments coupled to manipulator arms 812.
- Manipulator assembly 802, user control apparatus 804, and auxiliary apparatus 806 may be communicatively coupled to one another in any suitable manner.
- For example, manipulator assembly 802, user control apparatus 804, and auxiliary apparatus 806 may be communicatively coupled by way of control lines 816, which may represent any wired or wireless communication link as may serve a particular implementation.
- Manipulator assembly 802, user control apparatus 804, and auxiliary apparatus 806 may each include one or more wired or wireless communication interfaces, such as one or more local area network interfaces, Wi-Fi network interfaces, cellular interfaces, and so forth.
- One or more of the processes described herein may be implemented at least in part as instructions embodied in a non-transitory computer-readable medium and executable by one or more computing devices.
- A processor (e.g., a microprocessor) receives instructions from a non-transitory computer-readable medium (e.g., a memory) and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein.
- Such instructions may be stored and/or transmitted using any of a variety of known computer-readable media.
- A computer-readable medium includes any non-transitory medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer).
- Such a medium may take many forms, including, but not limited to, non-volatile media and/or volatile media.
- Non-volatile media may include, for example, optical or magnetic disks and other persistent memory.
- Volatile media may include, for example, dynamic random access memory (“DRAM”), which typically constitutes a main memory.
- Computer-readable media include, for example, a disk, hard disk, magnetic tape, any other magnetic medium, a compact disc read-only memory (“CD-ROM”), a digital video disc (“DVD”), any other optical medium, random access memory (“RAM”), programmable read-only memory (“PROM”), erasable programmable read-only memory (“EPROM”), FLASH-EEPROM, any other memory chip or cartridge, or any other tangible medium from which a computer can read.
- Processor 904 generally represents any type or form of processing unit capable of processing data and/or interpreting, executing, and/or directing execution of one or more of the instructions, processes, and/or operations described herein.
- Processor 904 may perform operations by executing computer-executable instructions 912 (e.g., an application, software, code, and/or other executable data instance) stored in storage device 906.
- I/O module 908 may include one or more I/O modules configured to receive user input and provide user output.
- I/O module 908 may include any hardware, firmware, software, or combination thereof supportive of input and output capabilities.
- I/O module 908 may include hardware and/or software for capturing user input, including, but not limited to, a keyboard or keypad, a touchscreen component (e.g., touchscreen display), a receiver (e.g., an RF or infrared receiver), motion sensors, and/or one or more input buttons.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Image Processing (AREA)
Abstract
An example dynamic measurement system may be configured to generate, based on imagery of a scene, a deformable 3D model of the scene, identify a first point on an anatomical object located within the scene, and determine, based on the deformable 3D model, a dynamic measurement value representative of a contour physical distance between the first point and a second point within the scene. The dynamic measurement value is dynamically updated with movement of one or more objects within the scene.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202263405561P | 2022-09-12 | 2022-09-12 | |
| PCT/US2023/032175 WO2024058965A1 (fr) | 2022-09-12 | 2023-09-07 | Determining a contour physical distance within a subject based on a deformable three-dimensional model |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP4588006A1 (fr) | 2025-07-23 |
Family
ID=88236454
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP23782671.4A Pending EP4588006A1 (fr) | Determining a contour physical distance within a subject based on a deformable three-dimensional model | 2022-09-12 | 2023-09-07 |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP4588006A1 (fr) |
| CN (1) | CN119790433A (fr) |
| WO (1) | WO2024058965A1 (fr) |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110944594A (zh) * | 2017-06-19 | 2020-03-31 | Mohamed R. Mahfouz | Hip surgical navigation using fluoroscopy and tracking sensors |
| US11219501B2 (en) * | 2019-12-30 | 2022-01-11 | Cilag Gmbh International | Visualization systems using structured light |
- 2023
- 2023-09-07 EP EP23782671.4A patent/EP4588006A1/fr active Pending
- 2023-09-07 WO PCT/US2023/032175 patent/WO2024058965A1/fr not_active Ceased
- 2023-09-07 CN CN202380064832.7A patent/CN119790433A/zh active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| CN119790433A (zh) | 2025-04-08 |
| WO2024058965A1 (fr) | 2024-03-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20250090241A1 (en) | Systems and methods for tracking a position of a robotically-manipulated surgical instrument | |
| US20250054147A1 (en) | Composite medical imaging systems and methods | |
| Grasa et al. | Visual SLAM for handheld monocular endoscope | |
| US11896441B2 (en) | Systems and methods for measuring a distance using a stereoscopic endoscope | |
| USRE49930E1 (en) | Methods and systems for computer-aided surgery using intra-operative video acquired by a free moving camera | |
| Richa et al. | Vision-based proximity detection in retinal surgery | |
| US20230190136A1 (en) | Systems and methods for computer-assisted shape measurements in video | |
| CN114730454A (zh) | 场景感知系统和方法 | |
| US20230410499A1 (en) | Visibility metrics in multi-view medical activity recognition systems and methods | |
| US12450760B2 (en) | Using model data to generate an enhanced depth map in a computer-assisted surgical system | |
| EP4588006A1 (fr) | 2025-07-23 | Determining a contour physical distance within a subject based on a deformable three-dimensional model |
| US20200074737A1 (en) | Visualization of ultrasound images in physical space | |
| WO2025010362A1 (fr) | Détermination d'une distance curviligne à l'intérieur d'un sujet | |
| WO2024072689A1 (fr) | Systèmes et procédés pour déterminer une force appliquée sur un objet anatomique à l'intérieur d'un sujet sur la base d'un modèle tridimensionnel déformable | |
| US12285153B2 (en) | Anatomical scene visualization systems and methods | |
| WO2024186869A1 (fr) | Génération, basée sur la profondeur, d'images de réalité mixte | |
| Penza et al. | Augmented Reality Navigation in Robot-Assisted Surgery | |
| Reiter | Assistive visual tools for surgery |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
| 17P | Request for examination filed |
Effective date: 20250321 |
|
| AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR |