WO2001056491A2 - System and method for computer aided treatment planning - Google Patents
System and method for computer aided treatment planning
- Publication number
- WO2001056491A2 WO2001056491A2 PCT/US2001/003746 US0103746W WO0156491A2 WO 2001056491 A2 WO2001056491 A2 WO 2001056491A2 US 0103746 W US0103746 W US 0103746W WO 0156491 A2 WO0156491 A2 WO 0156491A2
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- intervention
- computer aided
- treatment planning
- virtual
- planning according
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/50—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H70/00—ICT specially adapted for the handling or processing of medical references
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Instruments for taking body samples for diagnostic purposes; Other methods or instruments for diagnosis, e.g. for vaccination diagnosis, sex determination or ovulation-period determination; Throat striking implements
- A61B10/02—Instruments for taking cell samples or for biopsy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/374—NMR or MRI
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F2/00—Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
- A61F2/02—Prostheses implantable into the body
- A61F2/18—Internal ear or nose parts, e.g. ear-drums
- A61F2002/183—Ear parts
Definitions
- the present invention relates generally to surgical planning and more particularly relates to a system and method of using three dimensional interactive computer visualization in surgical planning, optimization and evaluation.
- the ear is a specialized organ for which the computer aided treatment planning is expected to play a valuable role.
- the ear is an internal organ that is difficult to examine because it is encased in the temporal bone.
- the ear also contains important anatomic structures, including the hearing bones (ossicles), inner ear organs of hearing and balance, and facial nerve.
- Congenital aural atresia (CAA) is a congenital developmental anomaly of the middle ear that manifests with varying degrees of external auditory canal stenosis or atresia, ossicular derangements, and poorly developed mastoid and tympanic cavities. This disease results in conductive hearing loss which can be severe. In some cases, CAA can be treated surgically. However, because of the complex anatomy of the ear and the risks associated with this surgical procedure, such as facial paralysis, thorough surgical planning is required to assess and maximize the likelihood of successful surgery.
- Preoperative imaging such as by computerized tomography (CT) is considered an important element in surgical planning.
- Conventional two-dimensional (2D) CT images demonstrate the key anatomic structures of the ear, including the stapes, middle ear space, inner ear and facial nerve.
- the 2D CT is limited in its ability to represent the spatial relationships between these important structures. For example, an aberrant facial nerve, a retro-displaced temporomandibular joint, or a low-lying tegmen tympani might make surgical reconstruction difficult or impossible.
- Three-dimensional (3D) information would be very helpful for planning surgical therapy for treating congenital aural atresia and other forms of ear pathology.
- a method of computer aided treatment planning in accordance with the invention includes generating a 3D image of a region which includes at least one anatomical structure for which treatment is contemplated.
- a virtual intervention is applied to the region of the 3D image to simulate at least a portion of the contemplated treatment.
- the effect of the intervention in terms of both efficacy and potential risk or collateral damage, can be assessed and the intervention can be interactively modified for improved treatment results.
- once the intervention plan is finalized, the user can navigate within the 3D image in the area where the intervention is applied to fully visualize the results of the intervention.
- the intervention plan can also be saved in computer readable storage for later use as a "gold standard" for evaluating the completed intervention at later dates.
- the present method is applicable generally to treatment planning, including surgical planning, virtual biopsy, and prosthesis implantation.
- in the planning of surgery to correct aural atresia, the region includes the ear and the at least one anatomical structure includes at least one of the temporal bone, facial nerve, and stapes.
- the virtual intervention includes the placement of a virtual cylinder representing the location where an external auditory canal may be formed.
- the proximity of the virtual cylinder to anatomical structures of the ear is measured and a warning can be provided if the proximity is less than a predetermined threshold distance.
- the step of modifying the intervention for this application includes changing the position and/or size of the virtual cylinder.
- a virtual drilling operation can be performed by removing the volume within the virtual cylinder and navigating through the region in the 3D image.
- the stages of placing an intervention, modifying the intervention, virtually performing the intervention and navigating through the results are both interactive and repeatable.
- Each plan can be stored in computer readable media as an entry in a database for future retrieval, examination and comparison.
- the manipulation of the 3D image can be costly and processor intensive.
- the step of generating the 3D image of a region includes acquiring a set of 2D images of a region; applying segmentation to extract structures of interest; converting the 2D images to a voxel based dataset of region; storing the voxel based dataset in a partitioned data structure; and rendering the 3D image from the voxel based dataset.
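- As a loose illustration of the first steps of this pipeline (stacking the acquired 2D slices into a voxel dataset and extracting a structure of interest), the Python sketch below uses NumPy; the int16 layout and the bone-like intensity threshold are illustrative assumptions rather than values from this disclosure, and a production system would substitute the segmentation approach described later.

```python
import numpy as np

def build_volume(slices):
    """Stack an ordered sequence of 2D image slices (e.g. from a spiral CT
    scan) into a single 3D voxel array indexed as (z, y, x)."""
    return np.stack([np.asarray(s, dtype=np.int16) for s in slices], axis=0)

def extract_bone_mask(volume, threshold=300):
    """Crude stand-in for the segmentation step: mark voxels above an
    intensity threshold (roughly bone on CT) as foreground."""
    return volume >= threshold
```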
- the partitioned data structure can take the form of a binary space partitioning (BSP) tree having a number of leaf nodes where the voxels of the voxel based dataset are stored.
- the step of applying the intervention can preferably include: identifying those leaf nodes which are affected by the intervention; applying the intervention to the affected leaf nodes to launch a voxel based constructive solid geometry (CSG) operation and image regeneration; and re-rendering only the portions of the 3D image associated with the affected leaf nodes.
- voxel-based constructive solid geometry (CSG) subtraction and level-of-detail rendering can also be used to reduce the processing burden.
- re-rendering can take place based on a simplified level of detail (LOD) rendering mode.
- the CSG operation preferably includes converting the virtual cylinder into a voxelized format and performing a voxel-based CSG subtraction between the virtual cylinder and the tissue bounded by the virtual cylinder volume.
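- A minimal sketch of such a voxelized-cylinder CSG subtraction, assuming the tissue is already available as a boolean voxel mask (the coordinate convention and function names here are illustrative):

```python
import numpy as np

def voxelize_cylinder(shape, p0, p1, radius):
    """Rasterize a cylinder (axis from voxel point p0 to p1, given radius)
    into a boolean mask with the same shape as the tissue volume."""
    zz, yy, xx = np.indices(shape)
    pts = np.stack([zz, yy, xx], axis=-1).astype(float)
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    axis = p1 - p0
    length = np.linalg.norm(axis)
    axis /= length
    rel = pts - p0
    t = rel @ axis                                   # projection onto the cylinder axis
    dist = np.linalg.norm(rel - t[..., None] * axis, axis=-1)
    return (t >= 0) & (t <= length) & (dist <= radius)

def csg_subtract(tissue_mask, cylinder_mask):
    """Voxel-based CSG subtraction: remove the cylinder voxels from the tissue."""
    return tissue_mask & ~cylinder_mask
```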
- the method of computer aided treatment planning can be used in connection with numerous contemplated treatments, such as biopsy.
- the intervention is the placement of a virtual biopsy needle into a targeted position where highly suspicious tissue resides within the region.
- a warning can be provided if the virtual biopsy needle is outside the targeted position.
- a warning can also be provided if the proposed path of the biopsy needle will damage anatomical structures in the vicinity.
- the virtual biopsy needle can be modeled as a virtual cylinder.
- the tissue within the volume of the virtual cylinder can be removed from the region, examined in a separate display, such as by analysis of the texture and geometric features of the tissue, and then reinserted into the region, if desired.
- Virtual biopsy can be used in numerous applications, including the planning for examining pulmonary nodules, prostate tissue abnormality and breast cell calcification.
- Virtual biopsy can differentiate among tissue types using geometric feature analysis, texture analysis and/or tissue density analysis to determine the types of cells which are within the region of tissue extracted by the virtual biopsy needle.
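- The sketch below illustrates one crude way such an analysis could begin, summarizing the voxels captured by the biopsy cylinder with simple density statistics; real geometric and texture feature analysis would be considerably richer.

```python
import numpy as np

def biopsy_features(volume, cylinder_mask):
    """Summarize the tissue captured by a virtual biopsy cylinder with a few
    density/texture statistics (illustrative only)."""
    sample = volume[cylinder_mask].astype(float)
    return {
        "mean_intensity": sample.mean(),
        "std_intensity": sample.std(),        # crude texture proxy
        "volume_voxels": int(cylinder_mask.sum()),
    }
```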
- the virtual intervention includes the removal of damaged tissue in the targeted vicinity of an implant.
- a virtual model of the implant can be optimized from an analysis of the targeted vicinity.
- the pathway for the insertion of the implant can be modeled as a generalized virtual cylinder.
- a warning can be provided if the generalized virtual cylinder placement will damage critical anatomical structures.
- the virtual model can be used to generate the actual insert.
- the resulting plan can be saved in a database for follow-up post-treatment evaluation of the actual intervention.
- Implants can take on various forms, such as cochlear implants, stent grafts and the like.
- the virtual cylinder is a generalized virtual volume which has the parameters of the tool it is simulating.
- the tool is a drill bit and the virtual cylinder is a simple cylinder with a length, and circular cross section.
- the term virtual cylinder is not so limited. The diameter can vary as can the cross section and contour along the long axis.
- the virtual cylinder may be a deformable model, such as, for example, when the tool being modeled is a flexible catheter.
- the intervention includes the analysis of the plaque component on the arterial wall and the size and removal of plaque in a targeted vicinity.
- a virtual model of the plaque can be optimized from an analysis of the targeted vicinity.
- the pathway for the removal of the plaque can be modeled as a generalized virtual cylinder.
- a warning can be provided if the generalized virtual cylinder has a size or placement which will be ineffective at extracting the plaque, or if the plaque has a high risk of rupture or dislodgement with the present placement.
- the virtual model can be used to generate the arterial insert which will be used in the actual operation.
- the resulting plan can be saved in a computer database for future reference and post-treatment evaluation.
- a system for computer aided treatment planning includes a scanner, such as a CT or MRI scanner, for acquiring image data from a patient.
- a processor is provided which receives the image data and generates a 3D, voxel based dataset representing at least a portion of the image data.
- a display, such as a liquid crystal display (LCD) or cathode ray tube (CRT), is operatively coupled to the processor and provides a 3D representation of the image data from the voxel based dataset.
- a user interface device such as a mouse or trackball, is operatively coupled to the processor and allows a user to manipulate the image on the display.
- Computer readable memory is operatively coupled to the processor and has a computer program stored therein.
- the computer program directs the processor to perform the steps of: applying a virtual intervention in the region of the 3D image in response to a signal from the user interface device; and analyzing the intervention and automatically generating a warning indication if the intervention presents a high degree of risk. By previewing a contemplated treatment virtually in this way, the user can modify the intervention through the user interface to eliminate the warning.
- Figure 1 is a simplified flow diagram illustrating an overview of present method for computer aided treatment planning
- Figure 2 is a simplified block diagram of a system suitable for performing the present method of computer aided treatment planning
- Figure 3 is a simplified flow diagram illustrating an overview of exemplary rendering and virtual surgery steps used in the present method in the planning of treatment for aural atresia;
- Figure 4 is an exemplary subroutine for generating a binary space partitioning (BSP) tree data structure used for storing and manipulating image data in the present method;
- Figure 5 is a pictorial representation of a geometrical subdivision process of a node in the BSP-tree
- Figure 6A is a schematic diagram of an exemplary image (illustrated in 2D) partitioned by a number of leaf nodes;
- Figure 6B is a detail of one leaf node partition from Figure 6A, further illustrating a region of overlap between a virtual cylinder placed in the image and the underlying tissue in the image;
- Figure 7 is a flow chart illustrating a voxel-based constructive solid geometry (CSG) subtraction operation which is employed in the virtual surgery operation of the present method
- Figure 8 is a graphical representation of an exemplary screen display illustrating a 3D rendering of the structures of the ear with background tissue and the temporal bone not shown;
- Figure 9 is a graphical representation of an exemplary screen display illustrating a 3D rendering of the structures of the ear and illustrating the placement of a virtual cylinder as part of a surgical planning procedure for treating aural atresia;
- Figure 10 is a graphical representation of an exemplary screen display illustrating a 3D rendering of the structures of the ear and further illustrating the placement of a virtual cylinder as part of a surgical planning procedure for treating aural atresia; and
- Figure 11 is a graphical representation of an exemplary screen display illustrating a 3D rendering looking down and through the auditory canal that is formed by performing a virtual drilling operation to remove the tissue occupied in the volume of the virtual cylinder.
- Figure 1 is a flow chart which illustrates an overview of the present method of computer aided treatment planning which is generally performed on a computer based system, such as that illustrated in Figure 2.
- the invention will be described in terms of medical applications performed on human patients and in the context of medical treatment, such as surgery, prosthesis implantation, biopsy, medication, therapeutic radiation, therapeutic ultrasound and the like. It will be appreciated, however, that the invention is not limited to human patients, nor to the exemplary list of treatments referenced.
- the term treatment is used to mean an intervention in a region, such as but not limited to tissue, that is intended to effect an alteration of the region.
- the method includes the initial step of generating a three dimensional (3D) image representation of a region for which some form of medical treatment or intervention is contemplated (step 102).
- Generating such a 3D image representation generally involves acquiring a sequential series of 2D images, such as from a spiral computed tomography (CT) scanner or magnetic resonance imaging (MRI) scanner and transforming this 2D image data into a volumetric data set which provides a 3D representation of the region on a 2D display, such as a computer monitor.
- some form of virtual intervention which simulates at least a portion of a proposed treatment, is applied to the 3D image (step 104).
- the virtual intervention can take on several forms, such as the removal of tissue or artery plaques, the repair or reconstruction of a diseased or malformed organ, the placement of a prosthetic implant, the placement of a stent graft, the placement of biopsy needle, the placement of therapeutic radiation and the like.
- the results of the virtual intervention can be evaluated and warnings can be generated indicative of high levels of risk attendant with the proposed intervention (step 106).
- the user can repeatedly modify the proposed intervention (step 108) and apply the modified intervention to the 3D image (step 104) until a satisfactory result is ascertained or it is determined that the proposed treatment is not feasible.
- Several alternative interventions can be saved in a database to compare the risks and efficacy of proposed alternative intervention plans.
- the final intervention can be simulated and the results fully applied to the 3D image (step 110).
- the user can then view the results and navigate in and around the region to determine the efficacy of the proposed treatment (step 112).
- the planned results can then be used as a guide for the actual treatment with coordinate registration between the virtual model and the patient and as a gold standard to evaluate the actual intervention during post- intervention follow up examinations.
- FIG. 2 is a simplified diagram of an exemplary system for performing the present computer aided treatment planning methods.
- a patient 201 lies down on a platform 202 while scanning device 205 scans the area that contains the organ or organs which are to be examined.
- the scanning device 205 contains a scanning portion 203 which acquires image data of the patient and an electronics portion 206.
- Electronics portion 206 generally includes an interface 207, a central processing unit 209, a memory 211 for temporarily storing the scanning data, and a second interface 213 for sending data to the virtual navigation platform.
- Interfaces 207 and 213 can be included in a single interface component or can even be the same component.
- the various operational components and subsystems in electronics portion 206 are connected together with conventional connectors.
- the data from the scanning portion 203 is generally in the form of a stack of two dimensional image slices of a region of interest, which are provided from conventional spiral computed tomography (CT) and magnetic resonance imaging (MRI) scanners.
- Central processing unit 209 converts the scanned 2D data to a 3D voxel data representation, in a manner known in the art, and stores the results in another portion of memory 211.
- the converted data can also be directly sent to interface unit 213 to be transferred to the virtual navigation terminal 216.
- the conversion of the 2D data could also take place at the virtual navigation terminal 216 after being transmitted from interface 213.
- the converted data is transmitted over carrier 214 to the virtual navigation terminal 216 in order for an operator to perform the computer aided treatment planning.
- the data can also be transported in other conventional ways such as storing the data on a storage medium and physically transporting it to terminal 216 or by using satellite transmissions.
- the scanned data need not be converted to its 3D representation until the visualization rendering engine requires it to be in 3D form. This may save computational steps and memory storage space.
- Virtual navigation terminal 216 includes a screen 217 for viewing the image data, an electronics portion 215 and interface device 219 such as a keyboard, mouse or track ball.
- the electronics portion 215 generally includes an interface port 221, a central processing unit 223, other components 227 necessary to run the terminal and a memory 225.
- the components in terminal 216 are connected together with conventional connectors.
- the converted voxel data is received in interface port 221 and stored in memory 225.
- the central processor unit 223 then assembles the 3D voxels into a virtual representation which can be displayed on screen 217.
- a graphics accelerator which is optimized for volume rendering can also be used in generating the representations.
- the virtual navigation terminal 216 can be embodied using a high speed graphics workstation, such as manufactured by Silicon Graphics, Inc., or in a high speed personal computer, such as an IBM compatible computer with a Pentium III (or higher) processor having a 1 GHz or faster clock speed.
- the operator can use interface device 219 to interact with the system.
- the interface device 219 can further be used to control the image being displayed, including the angle, size, rotation, navigational position and the like.
- Scanning device 205 and terminal 216, or parts thereof, can be part of the same unit. Numerous CT and MRI systems are suitable for such applications. A single platform may be used to receive the scanned image data, convert the image data to 3D voxels if necessary and perform the guided navigation.
- the surgical repair of aural atresia consists of several steps: drilling a new external auditory canal in the temporal bone; creating a new meatus in the concha; forming a new eardrum using temporalis fascia; and re-lining the bony canal with a skin graft.
- An important aspect of surgical planning in this regard is to define an acceptable path for the new auditory canal from the outer cortex to the inner ear and to visualize the important internal structures of the ear, including the relative spatial relationships among these structures.
- the present method of medical treatment planning provides for 3D imaging of the structures of the ear and allows interactive positioning of a virtual drilling site which will form the external auditory canal in the temporal bone. Using computer aided planning, this canal can be placed, analyzed and repositioned if necessary. If a number of proposed plans are acceptable, several variants can be saved in a database for comparison. Once the final location is accepted, then a virtual drilling operation can be performed to determine if the planned surgery is likely to be successful. Once the canal is virtually drilled, the user can navigate into and through the canal and visualize the surrounding structures of the ear. The operation is interactive and repeatable. The plan can be stored in the database and the surgeon can then compare different plans for the same case which have been stored in the database and choose the best available plan.
- FIG 3 is a flow chart illustrating an overview of the present method for performing surgical planning in the context of treating aural atresia.
- the primary intervention that the user, generally a surgeon, will undertake is the creation of a new external auditory canal which must be drilled through the temporal bone.
- a 3D rendering of the important structures of the ear is created using acquired 2D image scan data, such as from an MRI scan or CT scan (step 305). Because the various structures of the ear present a similar image intensity in the image scan data, it is preferable that the 3D rendering step include an image segmentation process to distinguish the various tissue types of the ear, such as the facial nerve, temporal bone and the like.
- FIG. 8 is an exemplary system display which could be observed by the user on terminal 217.
- the display preferably takes the form of a graphical user interface (GUI) which allows the image to be fully manipulated on the display by the user operating the user interface device 219.
- the top of the display includes a tool bar, where commonly used commands can be readily referenced, in a manner known in the art.
- the right hand portion of the display shows a user interface portion 804, where image parameters can be observed and manipulated.
- the main display portion 805 illustrates the segmented image data showing the various structures of the ear including the cochlea 806, stapes 808, malleus 810, and the facial nerve 812.
- the user can manually edit the small segmented structures to improve the segmentation results, if desired.
- the user can control the color and opacity of each of the structures.
- the user can also control non-geometric properties of each structure, such as shading, color and opacity.
- the user can render the temporal bone transparent to view the underlying anatomy of the ear.
- the temporal bone could be rendered transparent, such that the underlying anatomic structures could be viewed through the outline of the temporal bone and the spatial relationship among these structures could be observed.
- a 3D cylinder is used as the model for the drill bit which a surgeon will use to form the external auditory canal.
- This cylinder 902 is interactively placed by the user in a position which is expected to open a passage through the temporal bone 904 from the outer bony cortex toward the middle ear (step 310).
- the user can manipulate the length, diameter and position of the cylinder using the interface device 219 to drag the cylinder or by changing the parameters numerically in the cylinder control area 906 of the interface portion 804 of the display.
- the computer system then illustrates the placement of the cylinder, such as by displaying this region as a different color.
- the computer aided treatment terminal 216 can analyze the proximity of the cylinder in this initial placement to the critical structures surrounding the proposed drilling site, such as the facial nerve and ossicles, and issue warnings if drilling would result in potential damage to these structures (step 315). Based on the visual representation and the warnings that are presented, the user can alter the proposed intervention, such as by repositioning the cylinder, until a satisfactory path for the cylinder is determined or it is determined that the proposed treatment is not feasible. The user can also control the length and diameter of the cylinder to conform to different drilling depths and drill diameters, respectively. Once such a path has been determined, a virtual drilling operation can be performed by removing the region represented by the cylinder from the image (step 320). Of course, there may be more than one acceptable course of intervention. Accordingly, variations in the plan can be implemented and saved in a database such that alternatives can be compared and the best acceptable alternative may be selected.
- the user can navigate through the bore of the drill site to visualize, from within the treated area, the efficacy and/or risks of the treatment (step 325).
- the user can navigate completely through the new auditory canal 1102 and, at the exit point, visualize the spatial relationship of the newly formed canal with respect to the inner ear structures, such as the ossicular mass 1104.
- the colors of the objects being viewed, as well as the relative opacity of the objects can be varied by the user to present different views of the region.
- the virtual cylinder 902 is illustrated in Figures 9 and 10 as a simple cylinder having a circular cross section and a long axis.
- This simple model is a reasonable representation of the rigid drill bit used in the aural atresia surgical procedure.
- the virtual cylinder need not be a simple cylindrical model and that it can and should be sized and shaped to conform to the size and shape of the instrument that will be used to perform the contemplated treatment.
- the virtual cylinder need not have a circular cross section nor a constant diameter. For example, if the tool is generally shaped as a curved rod, a cylindrical model which is curved along its major axis would be appropriate.
- a more complex "virtual cylinder" model can be used.
- the generalized virtual cylinder can be modeled as a deformable object to fit in a specific anatomical environment and to model flexible tools, such as catheters. Regardless of the actual model employed, the principle of the method remains the same.
- a difficulty encountered in the imaging step 305 is that several of the relevant anatomical structures have similar intensity values on the CT image. This, in combination with the complex anatomy of the ear, can make it difficult to distinguish the various structures.
- a two-level image segmentation process can be employed. The two-level segmentation process involves low-level processing of the voxels in the region of interest followed by high-level organ extraction.
- the voxels of the 3D dataset are clustered into groups based on an intensity feature of the voxels, which can be measured by an associated local intensity value vector.
- This can be determined using a modified self-adaptive on-line vector quantization algorithm, such as is described in the article "A self-adaptive on-line vector quantization algorithm for MRI segmentation," by Chen et al. in the proceedings of The 7th Scientific Meeting of ISMRM, May 1999, Philadelphia, which is hereby incorporated by reference.
- each voxel is associated with a local vector which is defined in 3D space. From the local vectors, a feature vector series can be derived using a component analysis which is well known in the art.
- the feature vectors are then clustered using a self-adaptive on-line vector quantization algorithm.
- the voxels are then grouped according to the classification of their feature vectors and are assigned an integer value representing this classification.
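- The following toy on-line vector quantization loop conveys the flavor of such clustering; the distance threshold, learning rate and update rule are simplifications for illustration and are not the published algorithm.

```python
import numpy as np

def online_vq(features, new_cluster_dist, lr=0.05):
    """Toy on-line vector quantization: assign each feature vector to its
    nearest codeword, spawning a new codeword when no existing one is close
    enough, and pulling the winning codeword toward each sample."""
    codebook = [features[0].copy()]
    labels = np.empty(len(features), dtype=int)
    for i, f in enumerate(features):
        dists = [np.linalg.norm(f - c) for c in codebook]
        k = int(np.argmin(dists))
        if dists[k] > new_cluster_dist:
            codebook.append(f.copy())          # spawn a new class
            k = len(codebook) - 1
        else:
            codebook[k] += lr * (f - codebook[k])
        labels[i] = k
    return np.array(codebook), labels
```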
- the high level organ extraction processing can follow. Initially, a user locates a seed, or starting point, within regions representative of soft tissue, bone and air spaces. The system then applies a region growing algorithm starting from the seed points to extract the anatomical features of the ear, such as the temporal bone, stapes, ossicles and facial nerve.
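- A simple 6-connected region growing sketch over the classified voxel labels, with the seed location and class tolerance supplied by the user, might look like the following (illustrative only):

```python
import numpy as np
from collections import deque

def region_grow(labels, seed, tol=0):
    """Grow a region from a seed voxel over the label volume produced by the
    low-level clustering step, accepting 6-connected neighbors whose class
    differs from the seed class by at most `tol`."""
    seed = tuple(seed)
    target = int(labels[seed])
    mask = np.zeros(labels.shape, dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in offsets:
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[i] < labels.shape[i] for i in range(3)) and not mask[n]:
                if abs(int(labels[n]) - target) <= tol:
                    mask[n] = True
                    queue.append(n)
    return mask
```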
- the temporal bone which presents high contrast compared to the surrounding tissue is fairly easy to automatically segment.
- certain structures, such as the inner ear and facial nerve, may require additional user input to be fully delineated.
- the soft tissue of the inner ear presents a similar intensity value on CT images as compared to the surrounding soft tissue.
- extraction of the facial nerve may also require manual intervention from the user, such as by manually identifying the outline of the facial nerve in each image slice in which this structure is present.
- any method which provides accurate delineation of the neighboring structures in a region of interest can be used in the practice of the present treatment planning method.
- One such technique is described in the article "On segmentation of colon lumen for virtual colonoscopy" by Liang et al., Proceedings of SPIE Medical Imaging, pp. 270-278, Feb. 1999, San Diego.
- 3D image generation can be performed for each of the segmented objects using a number of known techniques, such as the Marching Cubes algorithm, which reconstructs the outer polygonal surface.
- the volume image dataset can be stored in a partitioned data structure, such as a binary space-partitioning (BSP) tree, in which the large dataset is parsed into relatively small portions which are stored in leaf nodes of the data structure.
- the processing burden for interactive operations can be significantly reduced.
- the processing burden can be further reduced by use of a level of detail (LOD) rendering mode and/or a wavelet transformation to reduce the data volume.
- the BSP-tree uses the original volume image data as the root node of the tree. This is represented by (0, 0, 0, dim_x, dim_y, dim_z, n_p), where the parameters dim_x, dim_y, and dim_z are the resolution of the original dataset and n_p is the total number of polygons within the original dataset. A binary subdivision is performed on this node that geometrically divides the node into two sub-nodes if n_p is greater than a predetermined constant n_0.
- the binary subdivision operation is recursively applied to each sub-node until the number of polygons in the sub-nodes is less than or equal to the value of n_0. At this point, the resulting sub-nodes are defined as leaves in the constructed tree data structure.
- the node subdivision which is used in the tree generation process is illustrated in the algorithm of Figure 4 and the pictorial diagram of Figure 5. Referring to Figure 5, (x0, y0, z0, x1, y1, z1, n_p) represents a node of the tree in Cartesian space. The subdivision process occurs by dividing the node using a Cartesian plane which is perpendicular to the longest axis of the given node.
- the longest axis of the node shown is along the X axis.
- the position at which the node will be bifurcated is determined by accumulating the number of polygons slice by slice through the volume in the node area along the X axis until the accumulated number, n_acc, reaches a value of n_p/2.
- This exemplary construction of the BSP tree results in a substantially balanced tree structure, both in tree depth and the number of polygons in the resulting leaf nodes.
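- A compact sketch of this recursive subdivision is given below; the Node class, the .centroid attribute assumed on polygons, and the leaf capacity N0 are illustrative assumptions, and the median-centroid split only approximates the slice-by-slice accumulation to n_p/2.

```python
from dataclasses import dataclass, field

N0 = 2000   # illustrative leaf capacity; the text only calls for "a predetermined constant"

@dataclass
class Node:
    bounds: tuple        # (x0, y0, z0, x1, y1, z1)
    polygons: list       # polygon objects assumed to expose a .centroid (x, y, z)
    children: list = field(default_factory=list)

def build_bsp(node):
    """Recursively bisect a node with a plane perpendicular to its longest axis,
    placed where roughly half of its polygons have accumulated, until each leaf
    holds at most N0 polygons."""
    if len(node.polygons) <= N0:
        return node
    x0, y0, z0, x1, y1, z1 = node.bounds
    extents = (x1 - x0, y1 - y0, z1 - z0)
    axis = extents.index(max(extents))                        # longest axis
    polys = sorted(node.polygons, key=lambda p: p.centroid[axis])
    cut = polys[len(polys) // 2].centroid[axis]               # ~n_p/2 split position
    left = [p for p in polys if p.centroid[axis] <= cut]
    right = [p for p in polys if p.centroid[axis] > cut]
    if not left or not right:                                 # cannot split further
        return node
    lo, hi = list(node.bounds), list(node.bounds)
    lo[3 + axis] = cut                                        # left child: lower its upper bound
    hi[axis] = cut                                            # right child: raise its lower bound
    node.children = [build_bsp(Node(tuple(lo), left)),
                     build_bsp(Node(tuple(hi), right))]
    return node
```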
- Figure 6A illustrates the placement of the virtual cylinder 602 of step 310, represented as a voxel based volume, into a portion of the volume image 600 including a portion of the temporal bone 601.
- the drawings in Figures 6A and 6B are illustrated in two dimensions; however, it will be appreciated that the image is a 3D voxel representation.
- the image 600 is illustrated as partitioned by the leaf nodes 604, 606, 608, 610 and 612 of the BSP data structure. To determine the effect of the placement of cylinder 602, the leaf nodes which are affected by the placement of the cylinder 602 are identified, then a voxel-based constructive solid geometry (CSG) subtraction operation is performed.
- a minimum, axis-aligned bounding box 614 is generated which encompasses the cylinder (step 702).
- the bounding box 614 is partitioned into blocks 616, 618, 620, 622, 624 according to the boundaries of the leaf nodes (step 704).
- a first block is selected (step 706) and the intersection 624 of the cylinder 602 and temporal bone 601 within the selected block 604 is determined (step 708). It is within this intersection region 624 that a CSG subtraction operation and voxel regeneration operation will take place.
- because the original voxels in the intersection region 624 within the block will be altered by the CSG subtraction operation, the original voxels are stored as a backup so that the previous image can be restored, for example, if the cylinder 602 is subsequently repositioned (step 710).
- a subtraction of the voxel values of the cylinder 602 from the voxel values of the temporal bone 601 in the intersection region 624 determines the shape after a drilling operation (step 712).
- a regeneration process in the area of intersection region provides the new surface description (step 714).
- the new surface can be rendered in the 3D image of the block, such as by using a Marching Cubes algorithm (step 716).
- the process from step 708 to step 716 is repeated for each block within the bounding box 614 (step 718).
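- The loop over blocks might be organized as in the following sketch, where leaf_boxes is an assumed mapping from leaf identifiers to their (z0, z1, y0, y1, x0, x1) bounds; the per-block surface regeneration and re-rendering (e.g. a Marching Cubes pass) is left to the rendering layer.

```python
import numpy as np

def intersect_boxes(a, b):
    """Intersection of two (z0, z1, y0, y1, x0, x1) boxes, or None if disjoint."""
    out = tuple(max(a[i], b[i]) if i % 2 == 0 else min(a[i], b[i]) for i in range(6))
    return out if all(out[2 * i] < out[2 * i + 1] for i in range(3)) else None

def apply_virtual_drill(tissue_mask, cylinder_mask, leaf_boxes, backups):
    """Per-block voxel CSG subtraction (in the spirit of steps 702-718):
    clip the cylinder's bounding box to each leaf, back up the original
    voxels, and remove the overlap. Returns the leaves that changed."""
    zs, ys, xs = np.nonzero(cylinder_mask)
    bbox = (zs.min(), zs.max() + 1, ys.min(), ys.max() + 1, xs.min(), xs.max() + 1)
    touched = []
    for leaf_id, box in leaf_boxes.items():
        block = intersect_boxes(bbox, box)                  # step 704: clip to this leaf
        if block is None:
            continue
        sl = tuple(slice(block[2 * i], block[2 * i + 1]) for i in range(3))
        overlap = cylinder_mask[sl] & tissue_mask[sl]       # step 708: cylinder/tissue overlap
        if overlap.any():
            backups[leaf_id] = tissue_mask[sl].copy()       # step 710: restore point
            tissue_mask[sl] &= ~cylinder_mask[sl]           # step 712: CSG subtraction
            touched.append(leaf_id)
    return touched                                          # only these blocks need re-rendering
```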
- a level-of-detail (LOD) rendering mode can be used.
- a reduced dataset is generated from the full volume data set.
- the 512x512x256 full dataset can be reduced to a 64x64x32 reduced volume dataset using a multi-resolution decomposition with three levels of thresholding.
- polygons used to render the volume images in both the enlarged and reduced volume datasets can be extracted.
- a traditional Marching Cubes method can be used to extract polygons to fit the surface of the object.
- polygon culling can be applied by first removing those leaf nodes that are completely outside the field-of-view from current processing operations. The remaining polygons are recalled from the BSP tree, ordered and rendered in those spaces which were not culled.
- the BSP tree provides an effective tool for selecting a relevant portion of the dataset for a particular navigation or display operation.
- the enlarged and reduced datasets are cooperatively used in a two level LOD rendering mode. If a user is interacting with the object being displayed, such as rotating, shifting or effecting other changes in the field of view, the polygons from the reduced dataset (64-sized) are rendered. Because of the significantly lower number of polygons involved, interaction with the reduced dataset volume can be performed faster and with less processing overhead. The tradeoff for the increased speed is reduced image resolution. If there is no interaction from the user after a predetermined time period, the polygons of the enlarged dataset (512-sized) are selected from the BSP tree and are rendered to provide a high resolution image of the current field of view.
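- A two-level LOD policy of this kind can be captured in a few lines; the idle timeout and the class shape below are illustrative assumptions, and the actual draw calls are left to the rendering layer.

```python
import time

class LodRenderer:
    """Two-level LOD policy: use the reduced (e.g. 64-sized) polygon set while
    the user is interacting, and switch to the full-resolution set once the
    view has been idle for `idle_s` seconds."""
    def __init__(self, reduced_polys, full_polys, idle_s=1.0):
        self.reduced, self.full, self.idle_s = reduced_polys, full_polys, idle_s
        self.last_interaction = 0.0

    def on_interaction(self):
        """Call whenever the user rotates, shifts or otherwise changes the view."""
        self.last_interaction = time.monotonic()

    def polygons_to_render(self):
        idle = time.monotonic() - self.last_interaction
        return self.reduced if idle < self.idle_s else self.full
```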
- the large volume can be shrunk to a smaller scale space for structure analysis.
- a shrinking method based on multiresolution analysis theory can be used.
- the input data is the stack of binary images of the same size which can be obtained from the segmentation results of the CT or MRI scan.
- the x-direction is taken along the slice image width
- the y-direction is along the slice image height
- the z-direction is along the direction of slice by slice.
- the foreground voxels in the tree volume are set to a value of 128 (maximum) and the background voxels are set to a value of 0 (minimum).
- a Daubechies' bi-orthogonal wavelet transform with all rational coefficients can be employed.
- This one-dimensional (1D) discrete wavelet transformation (DWT) is first applied along the x-direction, row by row. From application of the DWT, only the lower frequency components are retained and packed. The computation is preferably implemented in floating point. Noting that the DWT is applied to a binary signal, there are two kinds of nonzero coefficients which result in the lower frequency component. The first kind has a value of 128; these coefficients are located in the interior of the volume.
- the second kind has a value not equal to 128; these coefficients are located at the boundary of the volume.
- the coefficients of the second kind are compared against a predetermined threshold value. If a coefficient's absolute value is larger than a pre-set threshold T1, the value of the coefficient is set to 128; otherwise, it is set to 0.
- the same DWT is then applied to the resulting dataset along the y-direction, column by column, where similar thresholding is applied to the lower frequency components.
- the result is again a stack of binary images, but now with half the row and column size as compared to the original dataset.
- the DWT is applied to the last result along the z-direction and the lower frequency components are retained. This step completes the first level decomposition.
- the resulting dataset of the first level decomposition is of half size in all three directions as compared to the original dataset. If the shrinking procedure stops at this level, a final thresholding is applied, which revalues those coefficients with nonzero, non-128 values. If the absolute value of such a coefficient is larger than a pre-set threshold T2, it is revalued as 128; otherwise, it is revalued as 0. If further shrinking is needed, the same thresholding algorithm is applied with the threshold T1. Further shrinking proceeds as previously described, but is applied to the dataset shrunk at the previous level. The decomposition procedure can be recursively applied until the resulting volume meets the desired reduced data volume. In the case where the slice images are of 512x512 pixel size, the maximum decomposition level is usually three, resulting in a 64x64 reduced pixel size.
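- The sketch below mimics one level of this shrinking scheme using a plain Haar average as a stand-in for the rational-coefficient bi-orthogonal DWT described in the text; the threshold values are mid-range guesses from the ranges quoted below and are not taken from this disclosure.

```python
import numpy as np

def shrink_once(volume, threshold):
    """One decomposition level: keep the low-frequency half along each axis
    (Haar-style averaging as a simplified stand-in for the DWT), then
    re-binarize boundary coefficients against the given threshold."""
    v = volume.astype(float)
    for axis in range(3):
        n = v.shape[axis] // 2 * 2
        v = np.moveaxis(v, axis, 0)[:n]
        v = (v[0::2] + v[1::2]) / 2.0          # low-pass + downsample by 2
        v = np.moveaxis(v, 0, axis)
    return np.where(v >= threshold, 128, 0).astype(np.uint8)

def shrink(volume, levels=3, t1=0.18 * 128, t2=0.85 * 128):
    """Recursive shrinking: earlier levels use the smaller threshold T1,
    the final level uses the larger threshold T2."""
    for i in range(levels):
        volume = shrink_once(volume, t2 if i == levels - 1 else t1)
    return volume
```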
- the volume is isotropically shrunk in all directions with the presented method.
- the two pre-set thresholds, T1 and T2, are used to control the degree of shrinking. If the volume is significantly over shrunk, connectivity may be lost in the reduced volume. If it is shrunk to a lesser degree, two separate branches may merge into one branch in the reduced volume dataset. The larger the two threshold values, the thinner the reduced volume is.
- the range of those two thresholds is [0, r × 128], where 0 < r < 1.
- the range for virtual endoscopy is r ∈ (0.08, 0.28) for T1 and r ∈ (0.7, 0.98) for T2.
- the exact determination is dependent on the feature size of the particular application and is selected to achieve reduction while retaining the fidelity of the structure information in the shrunk volume.
- each block 616, 618, 620, 622, 624 can be evaluated to determine if tissue types, other than that of the temporal bone, are proximate the current location of the cylinder 602. If the cylinder is within a predetermined threshold distance of a critical structure, a warning can be generated. For example, if the proposed cylinder location is within 1-2 mm of the facial nerve, a warning can be provided. The warning can take the form of a text message, an audible signal, or highlighting of the area of risk in the image, such as by use of a special color or by automatically magnifying the region for closer examination by the user.
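- One way to implement such a proximity check is with a Euclidean distance transform of the segmented critical structure, as in the following sketch (the mask names, voxel spacing handling and the 2 mm default are assumptions):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def proximity_warning(cylinder_mask, critical_mask, voxel_size_mm, min_mm=2.0):
    """Warn if any voxel of the planned drill cylinder lies within `min_mm`
    of a critical structure (e.g. the segmented facial nerve). The distance
    transform of the background gives, at each voxel, the distance to the
    nearest critical-structure voxel, scaled by the physical voxel size."""
    dist_to_critical = distance_transform_edt(~critical_mask, sampling=voxel_size_mm)
    closest = dist_to_critical[cylinder_mask].min()
    if closest < min_mm:
        return f"WARNING: planned canal passes {closest:.1f} mm from a critical structure"
    return None
```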
- although the invention has been described in detail in connection with treatment planning for aural atresia, it is generally applicable to numerous treatment planning operations, both in the ear and elsewhere in the body.
- the surgeon inserts a catheter into an occluded artery and inflates a balloon at the end of the catheter to force the occluded artery open and to expand a stent which maintains the opening. While this has become a common procedure, it is not without risk.
- the arterial occlusion is generally related to a build up of plaque and fatty deposits on the arterial wall. If a portion of these deposits are dislodged during the angioplasty process, there is a risk of stroke and other complications.
- the artery can be imaged and, through image segmentation, the quantity and nature of the plaque deposits can be determined.
- the severity of the occlusion can be viewed by the surgeon who can navigate in the 3D image within the artery.
- a virtual intervention can then be performed, i.e., placing a virtual catheter within the arterial volume and expanding a virtual stent, and the results observed. If problems are observed, the user can then alter the course of treatment to minimize the risk.
- the virtual catheter would require a dynamic model that conforms to the contours of the interior surface of the arterial wall.
- Such a model is analogous to the force field model previously used in guiding a virtual camera along a fly path in performing virtual colonoscopy.
- the present method is applicable to treatment planning for the formation and implantation of a stent graft for treating abdominal aortic aneurysms (AAA).
- the 3D imaging can be used to determine the size, location and nature of the aneurysm.
- the quality of the arterial wall can be determined by analyzing the composition of the arterial wall to determine the degree of plaque build up and stenosis.
- a virtual stent graft can be modeled to fit in the region of the AAA and the graft can be inserted into the 3D image.
- the surgical removal of plaque can be the modeled intervention. In either case, the user can navigate within the treated region to visualize the results of the proposed intervention.
- an organ such as the prostate, breasts or lungs can be scanned and rendered as a segmented 3D image.
- the image segmentation process at least partially highlights those portions of the organ of interest which have a high likelihood of containing cancerous tissue.
- a virtual cylinder can be placed into this region to simulate the insertion of a biopsy needle. The position, size and shape of the cylinder can be optimized by the user to insure that at least part of the suspicious region is within the volume of the cylinder. The region within the virtual cylinder can then be withdrawn from the organ and displayed in a different window of the display for further analysis.
- Known volume rendering techniques use one or more defined transfer functions to map different ranges of sample values of the original volume data to different colors, opacities and other displayable parameters for navigation and viewing.
- the selected transfer function generally assigns maximum opacity to the wall of the object being viewed.
- the physician can interactively change the transfer function assigned during the volume rendering procedure such that the outer surface being viewed becomes substantially transparent, allowing the interior structure of the region to be viewed.
- the suspicious area can be viewed at a number of different depths, with varying degrees of opacity assigned throughout the process.
- the shape of the region and texture of the region undergoing virtual biopsy can be analyzed to determine a likelihood of cancerous tissue in the region being biopsied.
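- A minimal opacity transfer function illustrating this idea is sketched below; the intensity window is an illustrative CT bone range, not a value from this disclosure, and lowering the wall opacity toward zero lets the interior structure show through.

```python
import numpy as np

def opacity_transfer(intensities, wall_range=(300, 1200), wall_opacity=1.0):
    """Map sample intensities to opacities: samples inside the 'wall' range
    receive the chosen wall opacity, everything else is rendered faintly."""
    lo, hi = wall_range
    alpha = np.full(intensities.shape, 0.05, dtype=float)   # faint background
    inside = (intensities >= lo) & (intensities <= hi)
    alpha[inside] = wall_opacity
    return alpha
```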
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Pathology (AREA)
- Animal Behavior & Ethology (AREA)
- Radiology & Medical Imaging (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Veterinary Medicine (AREA)
- Urology & Nephrology (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Radiation-Therapy Devices (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US10/182,217 US7630750B2 (en) | 2001-02-05 | 2001-02-05 | Computer aided treatment planning |
| AU2001238032A AU2001238032A1 (en) | 2000-02-04 | 2001-02-05 | System and method for computer aided treatment planning |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18031000P | 2000-02-04 | 2000-02-04 | |
| US60/180,310 | 2000-02-04 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2001056491A2 true WO2001056491A2 (fr) | 2001-08-09 |
| WO2001056491A3 WO2001056491A3 (fr) | 2002-02-21 |
Family
ID=22659975
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2001/003746 Ceased WO2001056491A2 (fr) | 2000-02-04 | 2001-02-05 | Systeme et procede de planification informatisee de traitement |
Country Status (2)
| Country | Link |
|---|---|
| AU (1) | AU2001238032A1 (fr) |
| WO (1) | WO2001056491A2 (fr) |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2004051601A1 (fr) * | 2002-12-03 | 2004-06-17 | Mentice Ab | Systeme de commande de simulateur d'intervention |
| EP2255843A1 (fr) * | 2009-05-29 | 2010-12-01 | FluiDA Respi | Procédé pour déterminer les traitements en utilisant des modèles de poumon spécifiques aux patients et procédés informatiques |
| US7993141B2 (en) | 2002-12-03 | 2011-08-09 | Mentice Ab | Interventional simulator system |
| US8083524B2 (en) | 2002-12-03 | 2011-12-27 | Mentice Ab | Interventional simulator system |
| US8491307B2 (en) | 2002-12-03 | 2013-07-23 | Mentice Ab | Interventional simulator control system |
| WO2016001278A1 (fr) | 2014-07-03 | 2016-01-07 | Koninklijke Philips N.V. | Dispositif et procédé pour afficher des informations tridimensionnelles pour une procédure interventionnelle |
| US10595942B2 (en) | 2011-12-14 | 2020-03-24 | Stryker European Holdings I, Llc | Techniques for generating a bone plate design |
| EP3383266B1 (fr) * | 2015-11-30 | 2021-08-11 | Materialise N.V. | Méthode mise en oeuvre par ordinateur pour fournir un dispositif à placer dans un passage de voies aériennes |
| US12446940B2 (en) | 2020-04-09 | 2025-10-21 | Cochlear Limited | Torque limiting drive tools |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104462650B (zh) * | 2014-11-10 | 2017-11-07 | 张建卿 | 一种可实现内外结构的实体化心脏3d模型制作方法 |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6256529B1 (en) * | 1995-07-26 | 2001-07-03 | Burdette Medical Systems, Inc. | Virtual reality 3D visualization for surgical procedures |
| US6064904A (en) * | 1997-11-28 | 2000-05-16 | Picker International, Inc. | Frameless stereotactic CT scanner with virtual needle display for planning image guided interventional procedures |
| US6112750A (en) * | 1998-03-24 | 2000-09-05 | International Business Machines Corporation | Method and system for assessing risks and prognoses of a given course of medical treatment |
| US6165193A (en) * | 1998-07-06 | 2000-12-26 | Microvention, Inc. | Vascular embolization with an expansible implant |
-
2001
- 2001-02-05 WO PCT/US2001/003746 patent/WO2001056491A2/fr not_active Ceased
- 2001-02-05 AU AU2001238032A patent/AU2001238032A1/en not_active Abandoned
Cited By (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8083524B2 (en) | 2002-12-03 | 2011-12-27 | Mentice Ab | Interventional simulator system |
| WO2004051601A1 (fr) * | 2002-12-03 | 2004-06-17 | Mentice Ab | Systeme de commande de simulateur d'intervention |
| US8491307B2 (en) | 2002-12-03 | 2013-07-23 | Mentice Ab | Interventional simulator control system |
| US7993141B2 (en) | 2002-12-03 | 2011-08-09 | Mentice Ab | Interventional simulator system |
| US8886500B2 (en) | 2009-05-29 | 2014-11-11 | Fluidda Respi | Method for determining treatments using patient-specific lung models and computer methods |
| WO2010136528A1 (fr) * | 2009-05-29 | 2010-12-02 | Fluidda Respi | Procédé utilisant des modèles de poumon spécifiques de patient pour déterminer des traitements, et procédés informatiques |
| EP2255843A1 (fr) * | 2009-05-29 | 2010-12-01 | FluiDA Respi | Procédé pour déterminer les traitements en utilisant des modèles de poumon spécifiques aux patients et procédés informatiques |
| US10595942B2 (en) | 2011-12-14 | 2020-03-24 | Stryker European Holdings I, Llc | Techniques for generating a bone plate design |
| US10610299B2 (en) | 2011-12-14 | 2020-04-07 | Stryker European Holdings I, Llc | Technique for generating a bone plate design |
| US11717349B2 (en) | 2011-12-14 | 2023-08-08 | Stryker European Operations Holdings Llc | Technique for generating a bone plate design |
| WO2016001278A1 (fr) | 2014-07-03 | 2016-01-07 | Koninklijke Philips N.V. | Dispositif et procédé pour afficher des informations tridimensionnelles pour une procédure interventionnelle |
| EP3383266B1 (fr) * | 2015-11-30 | 2021-08-11 | Materialise N.V. | Méthode mise en oeuvre par ordinateur pour fournir un dispositif à placer dans un passage de voies aériennes |
| US12446940B2 (en) | 2020-04-09 | 2025-10-21 | Cochlear Limited | Torque limiting drive tools |
Also Published As
| Publication number | Publication date |
|---|---|
| AU2001238032A1 (en) | 2001-08-14 |
| WO2001056491A3 (fr) | 2002-02-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US7630750B2 (en) | Computer aided treatment planning | |
| US7356367B2 (en) | Computer aided treatment planning and visualization with image registration and fusion | |
| Cebral et al. | From medical images to anatomically accurate finite element grids | |
| US6362821B1 (en) | Surface model generation for visualizing three-dimensional objects using multiple elastic surface nets | |
| EP0965104B1 (fr) | Procedes d'autosegmentation et d'autocontournage pour l'etablissement d'un plan de traitement de radiotherapie tridimensionnelle | |
| EP2194505B1 (fr) | Procédé et dispositif pour segmenter la colonne vertébrale et de l'aorte dans les données d'imagerie médicale en fonction d'un atlas squelettique | |
| CN100553561C (zh) | 在ct血管造影术中分割结构的方法及设备 | |
| US7747305B2 (en) | Computer-aided-design of skeletal implants | |
| EP1631931B1 (fr) | Procedes et systemes de placement d'implants guide par l'image | |
| WO2022183719A1 (fr) | Procédé et dispositif de planification préopératoire à base d'apprentissage profond pour chirurgie de révision de remplacement total de la hanche | |
| Caponetti et al. | Computer-aided simulation for bone surgery | |
| CN100421128C (zh) | 用于对断层图像数据分段的方法和图像处理系统 | |
| US10105145B2 (en) | Method for constructing a patient-specific surgical guide | |
| WO2001056491A2 (fr) | Systeme et procede de planification informatisee de traitement | |
| EP2821968B1 (fr) | Données de modèle de traitement représentant une section de corps biologique ayant une partie défectueuse | |
| Robb et al. | Patient-specific anatomic models from three dimensional medical image data for clinical applications in surgery and endoscopy | |
| EP0354026B1 (fr) | Affichage en trois dimensions de données tomographiques | |
| Shiaa et al. | A Novel Method Based on Interpolation for Accurate 3D Reconstruction from CT Images. | |
| Chan et al. | A virtual surgical environment for rehearsal of tympanomastoidectomy | |
| Stenzl et al. | Virtual reality of the lower urinary tract in women | |
| Todd et al. | An analysis of medical image processing methods for segmentation of the inner ear | |
| Salah et al. | Preoperative planning of a complete mastoidectomy: semiautomatic segmentation and evaluation | |
| Krol et al. | Computer-aided osteotomy design for harvesting autologous bone grafts in reconstructive surgery | |
| Stacy et al. | High performance computing in biomedical imaging research | |
| Wu | Accurate and efficient three-dimensional mesh generation for biomedical engineering applications |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
| AK | Designated states |
Kind code of ref document: A3 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A3 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
| REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 10182217 Country of ref document: US |
|
| 122 | Ep: pct application non-entry in european phase | ||
| NENP | Non-entry into the national phase |
Ref country code: JP |