
WO2007049323A1 - Apparatus for moving surgical instruments - Google Patents

Info

Publication number
WO2007049323A1
WO2007049323A1 PCT/IT2006/000758
Authority
WO
WIPO (PCT)
Prior art keywords
equipment according
image
images
detection device
detection
Legal status
Ceased
Application number
PCT/IT2006/000758
Other languages
English (en)
Inventor
Manolo Omiciuolo
Massimo Pagani
Simone Pio Negri
Vito Basile
Current Assignee
Sintesi ScpA
Original Assignee
Sintesi ScpA
Application filed by Sintesi ScpA
Priority to EP06821748A (published as EP1951141A1)
Publication of WO2007049323A1


Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90Identification means for patients or instruments, e.g. tags
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00681Aspects not otherwise provided for
    • A61B2017/00694Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00681Aspects not otherwise provided for
    • A61B2017/00694Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
    • A61B2017/00699Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body correcting for movement caused by respiration, e.g. by triggering
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00681Aspects not otherwise provided for
    • A61B2017/00694Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
    • A61B2017/00703Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body correcting for movement of heart, e.g. ECG-triggered
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/08Accessories or related features not otherwise provided for
    • A61B2090/0801Prevention of accidental cutting or pricking
    • A61B2090/08021Prevention of accidental cutting or pricking of the patient or his organs
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation

Definitions

  • the object of the present invention is an equipment for moving medical elements, such as by way of non-limiting example, surgical elements.
  • Technological background of the invention
  • With reference to diagnostic activities, various types of diagnostic imaging are currently available in order to obtain digital images of anatomic details, organs and tissues, which allow identifying morphological and/or functional characteristics that may be traced to particular physiological or pathological conditions.
  • Ultrasound, radiographic, nuclear magnetic resonance and scintigraphic detections can be considered as non-exhaustive examples. These detections generally have the purpose of generating digital images of organs and/or tissues, which point out morphological and/or functional properties, even with the aid of contrast media and/or injectable radioactive tracers.
  • the accuracy in the detection of neoplasias appears to be a crucial point in order to suitably ensure that the tissue intended to be removed and/or sampled has actually been excised.
  • this localization occurs by analyzing the results of the examinations that have been carried out during the diagnosis, and consequently, by identifying, on the surface of the animal's or patient's body, the region to be operated on, which corresponds to the inner portion of interest (organ, tissue or the like).
  • a mark is made on the body's surface, which indicates the region to be operated on. In any case, the identification of this region on the body's surface is carried out by means of a visual analysis of the body by an operator in charge, such as a surgeon.
  • the object of the present invention is to provide an equipment for moving medical elements that is alternative to known equipment, and for example, that overcomes the limitations of the above-mentioned known techniques on the localization of a patient's body region to be operated on.
  • the object of the present invention is achieved by an equipment for moving medical elements such as described in Claim 1, and preferred embodiments thereof as described in Claims 2 to 37.
  • the object of the present invention is also a method for using an equipment for moving medical elements such as defined in Claim 38 and preferred embodiments thereof as defined in Claims 39 and 40.
  • the equipment 1 is capable of carrying out in-vivo diagnoses by multi-modal and real-time imaging; the equipment 1 can further carry out the in-vivo localization of morphological and/or functional alterations and control the moving of surgical elements.
  • in-vivo is meant herein to refer to diagnoses and detections that are carried out on living beings that are preferably anaesthetized.
  • multi-modal is meant herein to refer to the possibility of using several detection methodologies.
  • the equipment 1 comprises a medical element 50, which is moved as a function of preset detections carried out and preset controls imposed by the operator, as will be better detailed below.
  • This medical element 50 can comprise, for example, a surgical element and/or a pointer element .
  • the surgical element can be any element that is capable of surgically operating on a patient's body region, and can comprise, for example, a tool suitable for carrying out incisions, holes, injections, or for taking liquid or solid samples, and suitable to be operated according to a suitable movement (e.g. a biopsy needle).
  • the pointer element is capable of operating on a patient's body region by indicating and pointing out the latter.
  • the pointer element is optical and comprises such a device as to send a visible radiation beam impinging on a limited area of a surface in this region (for example, also by projecting an image, such as a point, a circle, an X, or the like) such as to make it identifiable and visible to the naked eye with a sufficient precision.
  • the pointer element can preferably comprise a laser-emitting device.
  • in accordance with a first embodiment, the medical element 50 includes both the surgical element and the pointer element, which can be suitably moved in an automated manner. In accordance with a second embodiment, the medical element 50 includes only the pointer element, whereas the tools of the surgical element can be of a manual type and not moved in an automated manner.
  • in accordance with a third embodiment, the medical element 50 is of a surgical type and includes the surgical tools thereof, but is not provided with a pointer element. If not stated otherwise, exemplary reference will be made herein below to this third embodiment, though the description below can also be applied to the other embodiments of the medical element 50.
  • Suitable moving elements 51 can be associated with the surgical element 50, such as an electric and/or pneumatic motor, which operatively drives the element 50 to move the latter in one or more directions.
  • Both the surgical element 50 and the moving means 51 thereof can be mounted to a suitable support structure (not illustrated herein).
  • the surgical element 50 is preferably mounted to the same structure to which at least one of the first, second and third detection devices 10, 20, 30 is mounted, which are described herein below.
  • the equipment 1 comprises a first detection device 10 for identifying a patient's region of interest to be operated.
  • the first detection device 10 can be suitable, according to an example, to carry out a computer-aided Single Photon Emission Computed Tomography (SPECT), and can be provided with a suitable gamma camera.
  • the device 10 is a light-weight, small-sized scintigraphic system with small scan area and high spatial resolution.
  • the first detection device 10 acquires, with high sensitivity, functional types of data, i.e. concerning the behaviour of a specific gamma-emitting radiopharmaceutical that is intravenously injected into the patient.
  • the first detection device 10 provides a plurality of first identification images 11 that are time-ordered, i.e. detected in subsequent time instants from each other.
  • the first detection device 10 can be associated with a respective first moving element 51c that is arranged for moving the first detection device 10, as well as with the moving system 51 in general.
  • the acquisition process performed by the first device 10 can be carried out by a suitable mutual moving between the detection device 10 itself and the portion to be examined.
  • the first moving element 51c can be driven by a control device by means of which a user can define the scanning typology as desired.
  • the equipment 1 further comprises a central control unit (or central processing unit) 40 that is operatively associated at least with the first detection device 10 in order to process the images detected by the latter, and to operate said surgical member 50 as a function of these images by means of a suitable command signal 100.
  • control unit 40 can comprise a pre-processing block 41 being provided with a filtering block 41a that is arranged for filtering the noise and disturbances of the signal received from the first detection device 10.
  • the pre-processing block 41 can further comprise a scaling block 41b, which is arranged for re-processing the signal from the first detection device 10 (or from the filtering block 41a) such that the corresponding display occurs in a determined scale.
  • the pre-processing block 41 can further comprise a spatial alignment block 41c, which is arranged for re-processing the signal from the first detection device 10 (or from the filtering block 41a or scaling block 41b) such that the corresponding display occurs in a determined reference system.
  • For scaling and spatial alignment, the following technique is to be considered as a preferred, though not exclusive, example: to guide the alignment, the absolute coordinates of several pixels (2D case) or voxels (3D case) can be used.
  • the transformation between the two coordinate systems is given by the following expression: x = A·y + T, where:
  • x is the coordinates of the "aligned" image (i.e. the image suitably scaled and referred to the desired reference system)
  • y is the coordinates of the initial image (for example, the first identification image 11)
  • A is the scaling and rotation matrix
  • T is the translation matrix.
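  The alignment expression above can be sketched in code. The following Python snippet is an illustrative example only (function names and parameter values are invented, not part of the patent): it applies x = A·y + T to a 2D point, with A built as a scaling times a rotation.

```python
import math

def align_point(y, scale, theta, t):
    """Apply x = A*y + T for a 2D point, with A = scale * rotation(theta)."""
    c, s = math.cos(theta), math.sin(theta)
    ax = scale * (c * y[0] - s * y[1])  # A*y, first component
    ay = scale * (s * y[0] + c * y[1])  # A*y, second component
    return (ax + t[0], ay + t[1])       # add the translation T

# Example: scale by 2, rotate by 90 degrees, translate by (1, 0).
x = align_point((1.0, 0.0), scale=2.0, theta=math.pi / 2, t=(1.0, 0.0))
# x is approximately (1.0, 2.0)
```

  In the 3D (voxel) case, A becomes a 3x3 matrix and T a three-component translation, but the structure of the operation is identical.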
  • the equipment 1 further comprises one or more sensors 60 that are operatively associated with the patient's body.
  • the sensors 60 have the task of detecting the patient's heart rate, the movements associated with breathing and/or the movements associated with tremor events.
  • the sensors 60 are arranged to detect physiological signals, such as mechanical and/or electrical, and/or electromagnetic and/or electrochemical and/or the like, preferably periodical, which are generated by the patient's body.
  • the sensors 60 are connected to the processing unit 40, such that "blurring" events (i.e. interference between the vibrations generated by the patient's body and the detections carried out by the first device 10), which are due to the movements of the subject being observed and/or to other events that disturb the detections carried out by said device, can be avoided.
  • the processing unit 40 in fact, is provided with a synchronization block 42, which is operatively associated with at least the first detection device 10 in order to synchronize the signal incorporating the first images 11 with the signals 60a generated by the sensors 60.
  • the synchronization block 42 is connected downstream of the pre-processing block 41, such that the synchronization block 42 can operate on images that have been processed and filtered beforehand. This synchronization is carried out particularly with reference to the frequencies and amplitudes of the movements that are generated by the patient's body.
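  As a hedged illustration of what such a synchronization against the sensor signals 60a could look like (the patent does not specify an algorithm; the data layout and threshold below are invented), image frames can be gated so that only those acquired during a low-motion phase of a periodic physiological signal are retained:

```python
def gate_frames(frames, motion, threshold):
    """Keep only the frames whose timestamp falls in a low-motion phase.

    frames: list of (timestamp, image); motion: dict timestamp -> amplitude
    of the physiological movement (e.g. respiration) at that instant.
    """
    return [(t, img) for t, img in frames
            if motion.get(t, float("inf")) <= threshold]

frames = [(0, "img0"), (1, "img1"), (2, "img2"), (3, "img3")]
motion = {0: 0.1, 1: 0.9, 2: 0.2, 3: 0.8}  # sensor-derived amplitudes
kept = gate_frames(frames, motion, threshold=0.5)
# kept == [(0, "img0"), (2, "img2")]
```

  Frames without a matching sensor sample are discarded here, a conservative choice; an interpolation of the sensor signal would be an alternative.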
  • the equipment 1 can further comprise a graphic communication interface 70, by means of which an operator can interact with the equipment 1.
  • the communication interface 70 is provided with data presentation means, such as a monitor or similar display device, by means of which it provides the operator with the information that is detected by the first detection device 10.
  • the communication interface 70 is further provided with data input means, which can be employed by the operator to input information and/or commands. Particularly, when the result of the first detection has been displayed, the operator can decide whether to proceed with a further inspection, by means of a subsequent detection which will be described below, or to input a command to the equipment 1 in order to activate the generation of said command signal 100, such that the surgical element 50 can be moved according to the data available so far.
  • the subsequent detection can be carried out using a second detection device 20 and/or a third detection device 30, as will be discussed herein below.
  • the equipment 1 can comprise a second detection device 20, in order to acquire a plurality of three-dimensional images of at least one portion of the patient's body; preferably, this portion comprises the region identified by the first detection device 10.
  • the second detection device 20 has the task of acquiring geometrical (or morphological) information at least on said region.
  • This acquisition can be generally carried out throughout the patient's body, however, in order to limit the duration and complexity of the operation, as well as the discomfort caused to the patient, the acquisition of the three-dimensional images is provided to be carried out only on a portion of the patient's body.
  • the second detection device 20 can be a 2D and/or 3D scanning system suitable for acquiring the external morphology of a limited region of the patient, by means of a plurality of two-dimensional and three-dimensional images; for example, the second detection device 20 can be a structured light acquisition system.
  • the second detection device 20 acquires a plurality of second acquisition images 21 that are time-ordered, i.e. detected in subsequent time instants from each other.
  • the second detection device 20 can be associated with a respective second moving element 51d that is arranged for moving the second device 20, and belonging to the general moving system 51.
  • the acquisition process performed by the second device 20 can be carried out by a suitable mutual moving between the detection device 20 and the portion to be examined.
  • the second moving element 51d can be driven by a command device, by means of which a user can define the type of scanning as desired; preferably this command device is the same command device that operatively drives the first moving element 51c.
  • the second detection device 20 is operatively associated with the filtering block 41a, such that the second images 21 can be filtered by disturbances or noise that can be present.
  • the second detection device 20 can be operatively associated with the synchronization block 42, such that the signals incorporating the second images 21 can be synchronized with the signals 60a generated by the sensors 60.
  • the control unit 40 is operatively associated both with the first and second detection devices 10, 20 in order to combine the identification of the first device 10 with the acquisition carried out by the second device 20; following this combination, the control unit 40 generates a corresponding display signal for the interface 70, which helps the operator to generate a command 100 for moving the surgical element 50 as a function of this combination.
  • the command signal 100 can be sent to the moving means 51, which act on the architecture and thus on the surgical element 50, such that the latter can be moved as desired.
  • control unit 40 can comprise, within the pre-processing block 41, an adaptation unit 41b, which is arranged for referring the images that are detected by the first and second detection devices 10 and 20 to a same (either reduction or magnification) scale, such that the images result to be comparable to each other; this operation is generally indicated as the "scaling".
  • the selected reference scale can be that of the first identification images 11, that of the second acquisition images 21, or a scale other than the preceding ones.
  • the control unit 40 can further comprise, within the pre-processing block 41, a spatial alignment block 41c in order to refer the images provided by the first and second detection devices 10, 20 to a same spatial reference system.
  • the detections carried out by the first and second devices 10, 20 are combined in a same spatial reference system, such that the various images from the two devices 10, 20 can be superimposed and the information provided in said first and second images 11, 21 is simultaneously available.
  • the spatial alignment block 41c is connected downstream of the adaptation unit 41b such that the detected images are inserted in the same spatial reference after they have been all transformed to the same scale.
  • the synchronization block 42 can be connected to the filtering block 41a and/or the spatial alignment block 41c; thereby, the images provided by the second detection device 20 can be also synchronized with the signals 60a from the sensors 60.
  • by means of the communication interface 70, the operator is provided both with the information detected by means of the first detection device 10, and with the information detected by means of the second detection device 20; the operator can, at this stage, decide whether to activate the generation of said command signal 100, or to input a command in the equipment 1 in order to carry out further detections.
  • the control unit 40 can also comprise time alignment means 43, in order to refer the identification of the first device 10 and the acquisition of the second device 20 to a same time scale: this results in a command imposing the simultaneous acquisition by the two devices 10 and 20.
  • the time alignment means 43 have thus the task of setting the first identification images 11 (detected by the first device 10) and the second acquisition images 21 (detected by the second device 20) in a same time reference system.
  • the first and second detection devices 10, 20 have a same image detection frequency (or one is a multiple of the other), and are particularly in phase with each other, such as to obtain said simultaneous detection.
  • the time alignment means 43 ensure the simultaneity of the identifications carried out by the first device 10 and/or the acquisitions carried out by the second device 20, obtaining said time alignment.
  • the purpose of the time alignment means 43 is to associate each of said first identification images 11 with at least one of said second acquisition images 21, these first and second images 11, 21 relating to an identification and an acquisition in a same instant, respectively. This is necessary for carrying out a dynamic multimodal display.
  • a second identification image 11 which is obtained like said first image 11 and temporally subsequent thereto, is associated with a second acquisition image 21 that is also obtained like said first image 21, and temporally subsequent thereto.
  • the same work is done for the subsequent images.
  • two sequences are generated, which are synchronized to each other, of images 11 and 21 suitable to represent the time evolution of the events detected by the devices 10 and 20.
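  The association described above can be sketched as a nearest-timestamp pairing. This is an illustrative sketch only (the data layout and the nearest-neighbour rule are assumptions, not the patent's method):

```python
def pair_by_time(first_seq, second_seq):
    """Associate each first identification image with the second acquisition
    image whose timestamp is closest; both sequences are time-ordered lists
    of (timestamp, image)."""
    pairs = []
    for t1, img1 in first_seq:
        _, img2 = min(second_seq, key=lambda item: abs(item[0] - t1))
        pairs.append((img1, img2))
    return pairs

first = [(0.0, "id-0"), (0.5, "id-1"), (1.0, "id-2")]      # images 11
second = [(0.1, "acq-0"), (0.6, "acq-1"), (0.9, "acq-2")]  # images 21
pairs = pair_by_time(first, second)
# pairs == [("id-0", "acq-0"), ("id-1", "acq-1"), ("id-2", "acq-2")]
```

  When the two devices detect at the same frequency and in phase, as the patent prefers, this pairing reduces to matching frames one-to-one.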
  • the control unit 40 can further comprise a reconstruction block 44, which is operatively associated at least with the first and second detection devices 10, 20; preferably, the reconstruction block is connected downstream of the time alignment means 43.
  • the reconstruction block 44 has the task of obtaining a suitable reconstruction, relative to the scanning carried out by the devices, of the signals detected by the detection devices 10, 20.
  • suitable mathematical algorithms process the signals and/or information from the block 43 such as to generate a planar image and/or a volume reconstruction, and/or a surface reconstruction in the space.
  • mathematical algorithms for volume reconstruction can be "Filtered Back Projection" (FBP), and/or "Ordered Subset Expectation Maximisation" (OSEM), and/or the like.
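  OSEM belongs to the family of iterative expectation-maximization reconstruction algorithms. As a hedged sketch of the underlying idea (a toy two-pixel problem, not the patent's implementation; OSEM additionally splits the projections into ordered subsets to accelerate convergence), the basic MLEM update can be written as:

```python
def mlem(A, y, x0, iters):
    """MLEM update: x_j <- x_j * sum_i A_ij * (y_i / (A x)_i) / sum_i A_ij.

    A: system matrix (list of rows), y: measured projections, x0: initial image.
    """
    x = list(x0)
    col_sum = [sum(row[j] for row in A) for j in range(len(x))]
    for _ in range(iters):
        proj = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]
        ratio = [y[i] / proj[i] for i in range(len(y))]
        x = [x[j] * sum(A[i][j] * ratio[i] for i in range(len(A))) / col_sum[j]
             for j in range(len(x))]
    return x

# Toy problem: a 2-pixel "image" (2.0, 3.0) observed through three
# noiseless projections; the iteration recovers it from a flat start.
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
true_x = [2.0, 3.0]
y = [sum(A[i][j] * true_x[j] for j in range(2)) for i in range(3)]
x = mlem(A, y, x0=[1.0, 1.0], iters=50)
# x converges towards [2.0, 3.0]
```

  The multiplicative form keeps the estimate non-negative, which is why this family is favoured for emission tomography data.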
  • the control unit 40 can further comprise the composition block 45, which is operatively associated with at least the first and second detection devices 10, 20.
  • the composition block 45 is arranged to combine with each other (for example, by means of a pyramidal technique, "wavelet", and the like) the information incorporated in the images supplied by the detection devices 10, 20 such as to obtain a corresponding fusion image 123.
  • composition block 45 can be operatively associated with the synchronization 42, time alignment 43, and reconstruction 44 blocks, such that the signal incorporating the fusion image 123 can be synchronized with the signals 60a generated by the sensors 60.
  • a pyramid is defined as a sequence of auxiliary images where each level in the pyramid is a filtered and subsampled copy of the preceding level.
  • the lowermost level in the pyramid has the same scale as the original image (for example, the first identification image 11) and contains the information of higher resolution than the remaining levels in the pyramid.
  • the highest levels in the pyramid have a lower resolution, but they have a higher scale than the original image.
  • the basic concept is making a pyramid for the fused image (for example, the fusion image 123) from the pyramids of each starting image (for example, the first and second images 11, 21).
  • the fusion image 123 is then obtained by operating an inverse transformation on the pyramid.
  • the first step is making the pyramid for each source image; the fusion is thus obtained for each level in the pyramid using a selection principle that is based on the absolute maximum luminosity or on component average or other selection principles.
  • finally, the fused image (for example, the image 123) is reconstructed from the fused pyramid.
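  The pyramid procedure described above can be sketched as follows. This is an illustrative toy on 1D "images", using a nearest-neighbour Laplacian pyramid and a maximum-absolute-detail selection principle; it is not the patent's implementation, and real systems operate on 2D/3D data with better filters:

```python
def down(img):   # 2x downsample by averaging adjacent pixels
    return [(img[i] + img[i + 1]) / 2 for i in range(0, len(img) - 1, 2)]

def up(img):     # nearest-neighbour upsample back to twice the length
    out = []
    for v in img:
        out += [v, v]
    return out

def laplacian_pyramid(img, levels):
    """Detail levels (each = level minus upsampled next level) plus coarse base."""
    gauss = [img]
    for _ in range(levels - 1):
        gauss.append(down(gauss[-1]))
    lap = [[a - b for a, b in zip(gauss[k], up(gauss[k + 1]))]
           for k in range(levels - 1)]
    return lap, gauss[-1]

def fuse(img_a, img_b, levels=3):
    lap_a, base_a = laplacian_pyramid(img_a, levels)
    lap_b, base_b = laplacian_pyramid(img_b, levels)
    # selection principle: keep the strongest detail, average the base
    lap = [[a if abs(a) >= abs(b) else b for a, b in zip(la, lb)]
           for la, lb in zip(lap_a, lap_b)]
    out = [(a + b) / 2 for a, b in zip(base_a, base_b)]
    for level in reversed(lap):  # inverse transformation on the fused pyramid
        out = [u + d for u, d in zip(up(out), level)]
    return out

functional = [0.1, 0.9, 0.8, 0.1]   # e.g. from the first device 10
morphologic = [0.5, 0.2, 0.3, 0.6]  # e.g. from the second device 20
fused = fuse(functional, morphologic)  # the fusion image
```

  A useful sanity check on such a scheme is that fusing an image with itself must return the image unchanged, since collapsing a Laplacian pyramid is an exact inverse of building it.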
  • the composition block 45 can comprise a first processing block 45a, which is operatively associated with the first detection device 10 in order to receive at least one first identification image 11 and to generate a corresponding first auxiliary identification image 12.
  • the first auxiliary image 12 has lower resolution and higher scale than the first identification image 11 from which it has been generated; in other words, the first auxiliary image 12 is smaller and less defined than the first starting image 11.
  • the composition block 45 comprises a processing block 45b, which is operatively associated with the second detection device 20 in order to receive the second acquisition image 21 that is associated with the first identification image 11, and to generate a corresponding second auxiliary acquisition image 22.
  • the second auxiliary acquisition image 22 has a lower resolution and a higher scale than the first image 21 from which it has been generated; in other words, the second auxiliary image 22 is smaller and less defined than the second starting image 21.
  • the auxiliary images 12, 22 have the same resolution; furthermore, the first and second auxiliary images 12, 22 have the same reduction scale relative to the real dimensions of the imaged area of human body.
  • composition block 45 can further comprise combination means 45d that are operatively associated with the processing blocks 45a and 45b for generating a fusion image 123 as a function of the combination of the first and second auxiliary images 12, 22.
  • the combination means 45d have the task of combining with each other the information incorporated in the first auxiliary image 12 and in the second auxiliary image 22, such as to obtain said fusion image 123, in which the relevant data from the first and second images 11, 21 (from which the images 12, 22 have been generated) are suitably combined.
  • a morphological-functional image is thus obtained, i.e. an image simultaneously containing morphological and functional information of a same portion of tissue.
  • once the fusion image 123 has been generated, the control unit 40 sends it to the graphic interface 70.
  • the operator then activates the generation of said command signal 100, as a function of said fusion image 123, by means of the suitable moving control system 46; i.e. according to what is shown in the fusion image 123, the control unit 40 provides to move the surgical element 50 such that the latter can properly operate, preferably following a confirm command being entered by the operator, after the latter has checked the contents of the fusion image 123.
  • the procedure of tissue sampling via a needle will be described below: the steps of acquisition, processing, synchronization and fusion as described above generate a video image with a spatial and functional content that is subjected to the operator's interpretation.
  • the system is capable of providing the spatial coordinates, relative to a known reference system, of the location in which the operator decides to take some tissue.
  • the central processing unit, by means of suitable routines of direct and inverse kinematics (by way of example, refer to the Denavit-Hartenberg matrices), guides the moving and orientation, in order to adopt an operative position of the pointer and of any surgical element, such as the biopsy needle. After the operative position has been reached, the moving and actuation of the medical element can be commanded, i.e. according to the example, the biopsy sampling, or the activation of the laser emitter indicating the region to be operated on, can be carried out.
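  By way of a hedged illustration of such direct-kinematics routines (the link parameters below are invented for the example and do not describe the patent's mechanics), a standard Denavit-Hartenberg homogeneous transform and its chaining can be written as:

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg transform from link frame i-1 to frame i
    (4x4 homogeneous matrix, row-major)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def matmul(m, n):
    return [[sum(m[i][k] * n[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Direct kinematics of a hypothetical planar 2-link arm: chain the
# per-link transforms to locate the tool tip in the base frame.
T = matmul(dh_matrix(math.pi / 2, 0.0, 1.0, 0.0),
           dh_matrix(-math.pi / 2, 0.0, 1.0, 0.0))
tip = (T[0][3], T[1][3])
# tip is approximately (1.0, 1.0)
```

  Inverse kinematics then solves the opposite problem: given the target coordinates provided by the fused image, find the joint values that bring the needle tip there.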
  • the equipment 1 further comprises a third detection device 30, in order to make the operation of the equipment 1 more accurate and reliable.
  • the third detection device 30 can be, for example, an ultrasound and/or radiographic and/or nuclear magnetic resonance scanning system; this device plays a major role in the acquisition of morphological images of hard and/or soft tissues.
  • the third detection device 30 comprises an ultrasound and/or radiographic and/or nuclear magnetic resonance probe.
  • the control unit 40 is operatively associated with the third detection device 30 in order to combine, within said coordinate system, the detection by the third detection device 30 with the acquisition of the first device 10 and/or with the identification of the second device 20.
  • the images detected by the third detection device 30 are processed such as to be referred to the same spatial coordinate system as used for the acquisition carried out by the first device 10 and/or for the identification carried out by the second device 20.
  • the detections of the third device 30 can be superimposed on what has been detected by the first and/or second devices 10, 20, such that the various available information can be simultaneously used for moving the surgical element 50.
  • the third detection device 30 is operatively associated with the filtering block 41a, such that the images provided by the third device can be filtered from disturbances or noise that may be present.
  • the third detection device 30 is also operatively associated with the adaptation unit 41b, such that the images detected by the third device 30 can be referred to the same scale as the images from the first and/or second detection devices 10, 20.
  • the selected reference scale can be that of the first image 11, that of the second image 21, that of the images detected by the third device 30, or a scale other than the preceding ones.
  • the third detection device 30 is also operatively associated with the spatial alignment block 41c, such that the images detected by the third device 30 can be referred to the same spatial reference as the images from the first and/or second detection devices 10, 20.
  • the third detection device 30 is also operatively associated with the synchronization means 42 such that the signal incorporating the third detection images 31 is synchronized with the signals 60a that are generated by the sensors 60.
  • the third detection device 30 is also operatively associated with the time alignment means 43, such that it can carry out detection simultaneously with the detection that is carried out by the devices 10 and 20. The detection by this third device 30 is thus referred to the same time scale as used for the acquisition of the first device 10 and/or the identification of the second device 20.
  • the third detection device 30 has the same image detection frequency as the first and/or second detection devices 10, 20 (though it may also be a multiple or sub-multiple of that frequency); in particular, the detection of the third detection device 30 is in phase with the detection of the first and/or second detection devices 10, 20, whereby a substantially simultaneous detection by the devices used can be obtained.
  • the third detection device 30 provides a plurality of third detection images 31, which are time-ordered relative to each other.
  • the reconstruction and composition blocks 44 and 45 can also be associated with the third detection device 30 to combine the images acquired by the latter with those of the first and/or second detection devices 10, 20, thus obtaining a corresponding fusion image 123 and a command signal 100 that is preferably intended for the moving element 51.
  • the composition block 45 can comprise a third processing block 45c that is operatively associated with the third detection device 30 to receive at least one third image 31 and to generate a corresponding third auxiliary image 32.
  • the third auxiliary image 32 has lower resolution and higher scale than the third image 31 from which it has been generated.
  • the resolution and scale provided by the third auxiliary image 32 are substantially the same as provided by the first auxiliary image 12.
  • Said combination means 45d can also be operatively associated with the third detection device 30 for generating the fusion image 123 also as a function of the third auxiliary image 32.
  • the command signal 100 can be generated also as a function of what has been detected by the third device 30.
  • the operator is provided with the possibility of activating the generation of the command signal 100 or, alternatively, should the detections carried out prove insufficient, of activating one or more of the detection devices 10, 20, 30.
  • the first processing block 45a can be arranged for generating a plurality of first auxiliary images 12 from an individual first image 11. In this case, these first auxiliary images 12 have progressively lower resolution and higher scale than the first starting image 11.
  • a virtual pyramid is generated, defined by the sequence of first auxiliary images 12, in which each level, moving upward towards the vertex, is a filtered and subsampled copy of the auxiliary image of the level below.
  • the lowermost level in the pyramid thus consists of the first source image 11; the uppermost levels have a lower resolution and a higher scale than the original image 11.
  • the latter can, in fact, generate a plurality of third auxiliary images 32 that have, in a progressive sequence, a lower resolution and higher scale than the third source image 31.
  • first and second auxiliary images 12, 22 that occupy corresponding levels have the same resolution and the same scale as the corresponding first and second source images 11, 21. Furthermore, first and third auxiliary images 12, 32 that occupy corresponding levels also have the same resolution and the same scale as the corresponding first and third source images 11, 31.
  • the command unit 40 can be arranged for carrying out a volume reconstruction and a consequent tomographic imaging of the region of interest, starting from one or more scintigraphic detections obtained via the first detection device 10; in practice, a series of acquisitions is carried out from different points of view, which can be combined in order to obtain the functional information according to the tomographic technique.
  • the various functional blocks comprised within the control unit 40 have been presented separately and individually only to explain the different functionalities of the control unit 40; in practice, however, the control unit 40 can be made as a single electronic device, suitably programmed to carry out the operations described above. The invention achieves considerable advantages.
  • the equipment according to the invention allows transferring the precision and reliability of the detections carried out during the diagnostic step to the surgical step. Thereby, the overall quality and therapeutic effectiveness of the operation are significantly improved, while reducing its duration and the discomfort caused to the patient. Furthermore, the equipment according to the invention allows tracing, localizing and pointing out in a precise manner, directly on the patient and on the same site where the operation will be carried out, the exact location in which the operation has to be carried out, thereby significantly reducing the positioning and alignment errors that are generated when the diagnostic scanning and the therapeutic operation are carried out at a separate place and time.
  • a further significant advantage offered by the device is the possibility of repeating the diagnostic scanning several times, during (i.e. on-line) and/or at the end of the operation, so as to be capable of checking the result simultaneously with the operation, by following the time dynamics of the pathologic and/or physiologic event being inspected, or as a final check, which confirms that a sampling has actually been carried out according to the preceding diagnostic indications. It should be observed that a further advantage derives from the possibility of using the device, once suitably sized, for small animals: i.e. all the diagnosis functions are transferred "in-vivo", with real-time images guiding any medical element also on those subjects (by way of non-limiting example, mice and rats) that are used for medical and pharmacologic research.
  • the particular solution using several detection devices advantageously offers the possibility of integrating various types of information as they come from distinct dedicated diagnostic instruments.
  • the possibility of carrying out a fusion of the information incorporated within the images provided by the various detection devices proves notably advantageous when the apparatus is used in-vivo on patients or animals.
  • the introduction of integrated techniques gains importance not only in guided surgery but generally also in those examinations on pathologies that require a great precision, such as neoplasias at an initial stage.
  • the possibility offered by particular embodiments of the invention of using different morphological techniques confers a high degree of flexibility on the possible applications, in that the morphological and functional techniques can be selected and adapted to particular requirements, such as machines for operating rooms, diagnosis machines and biopsy sampling systems, and machines for in-vivo pharmacologic and diagnostic research on animals.
  • based on clinical inspections, the equipment will be provided with ultrasound or X-ray techniques integrated with scintigraphic techniques, by dimensioning the detection field to the pathology size or to the regions most suitable to be explored in the scintigraphic mode, by means of linear and/or tomographic scanning.
  • the possibility of displaying the information revealed (acquired) by the detection device in accordance with an example of the invention is advantageous in pharmacokinetics, in that it offers added value in the qualitative and quantitative study of the drug behaviour and of the patient's reaction thereto.
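The direct-kinematics step mentioned above (Denavit-Hartenberg matrices) can be sketched as follows. This is a minimal illustrative Python example, not the equipment's actual control code; the two-link planar arm and its joint values are assumptions made purely for the demonstration.

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one link."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def matmul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_params):
    """Chain the per-link DH transforms: base-to-end-effector transform."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
    for theta, d, a, alpha in dh_params:
        T = matmul(T, dh_matrix(theta, d, a, alpha))
    return T

# Hypothetical planar two-link arm with unit link lengths,
# joints at +90 and -90 degrees:
links = [(math.pi / 2, 0.0, 1.0, 0.0), (-math.pi / 2, 0.0, 1.0, 0.0)]
T = forward_kinematics(links)
x, y, z = T[0][3], T[1][3], T[2][3]  # end-effector position, here (1, 1, 0)
```

The inverse-kinematics step would run the opposite way, solving for the joint values that place the pointer or biopsy needle at the desired operative position.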
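The pyramid of auxiliary images described above (each level a filtered and subsampled copy of the level below, with the source image at the lowermost level) can be sketched as follows. The 2x2 box filter and the toy grayscale input are assumptions for illustration only; the text does not specify which filter the equipment uses.

```python
def downsample(img):
    """Filter and subsample: average each 2x2 block (simple box filter)."""
    h, w = len(img), len(img[0])
    return [[(img[2*r][2*c] + img[2*r][2*c+1] +
              img[2*r+1][2*c] + img[2*r+1][2*c+1]) / 4.0
             for c in range(w // 2)] for r in range(h // 2)]

def build_pyramid(img):
    """Level 0 is the source image; each higher level is a filtered,
    subsampled copy of the level below, up to a single-pixel vertex."""
    levels = [img]
    while len(levels[-1]) > 1 and len(levels[-1][0]) > 1:
        levels.append(downsample(levels[-1]))
    return levels

# Toy 4x4 grayscale image with pixel values 0..15:
source = [[float(r * 4 + c) for c in range(4)] for r in range(4)]
pyramid = build_pyramid(source)  # levels of size 4x4, 2x2, 1x1
```

Each level thus has lower resolution and higher scale than the one below it, matching the relationship stated between the auxiliary images 12 and the first image 11.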
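The fusion carried out by the combination means 45d, once the images have been brought to a common scale and spatial reference, could be sketched as a pixel-wise weighted combination. The weighting rule below is an assumption, since the text does not prescribe a specific fusion operator.

```python
def fuse_images(img_a, img_b, weight_a=0.5):
    """Pixel-wise weighted fusion of two images assumed to be already
    filtered, rescaled and spatially aligned to a common reference."""
    if len(img_a) != len(img_b) or len(img_a[0]) != len(img_b[0]):
        raise ValueError("images must share the same resolution")
    wb = 1.0 - weight_a
    return [[weight_a * a + wb * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

# Hypothetical functional (scintigraphic) and morphological (ultrasound)
# patches, already registered to the same coordinate system:
functional    = [[0.0, 8.0], [4.0, 2.0]]
morphological = [[10.0, 2.0], [6.0, 8.0]]
fused = fuse_images(functional, morphological, weight_a=0.5)
```

The resulting fusion image carries both kinds of information simultaneously, which is what allows the command signal 100 to be generated as a function of all the detections.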
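The time alignment described above (detections referred to a common time scale, in phase, possibly at a multiple or sub-multiple frequency) can be sketched as nearest-timestamp pairing between two image streams. The timestamps and tolerance below are illustrative assumptions, not values from the equipment.

```python
def align_frames(times_a, times_b, tolerance):
    """Pair each frame of stream A with the nearest-in-time frame of
    stream B, keeping only pairs within the given tolerance (seconds)."""
    pairs = []
    for i, ta in enumerate(times_a):
        j = min(range(len(times_b)), key=lambda k: abs(times_b[k] - ta))
        if abs(times_b[j] - ta) <= tolerance:
            pairs.append((i, j))
    return pairs

# Stream B sampled at twice the frequency of stream A and in phase with it,
# as the description allows (a multiple of the detection frequency):
times_a = [0.0, 0.2, 0.4]
times_b = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
pairs = align_frames(times_a, times_b, tolerance=0.05)
```

Only the in-phase samples of the faster stream are paired, so the fused detections can be treated as substantially simultaneous.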

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Image Processing (AREA)

Abstract

The invention concerns equipment (1) for moving surgical instruments, comprising one or a plurality of detection devices (10, 20, 30) for identifying a region to be operated on in a patient's body, and a surgical instrument (50) for carrying out an operation in that region. The equipment (1) further comprises a control unit (40) operatively associated with the devices (10, 20, 30) in order to receive a signal representing at least the identification carried out by one of these devices and to generate a corresponding command signal (100) for moving the surgical instrument (50) as a function of that identification.
PCT/IT2006/000758 2005-10-28 2006-10-27 Appareil servant a deplacer des instruments chirurgicaux Ceased WO2007049323A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP06821748A EP1951141A1 (fr) 2005-10-28 2006-10-27 Appareil servant a deplacer des instruments chirurgicaux

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ITMI2005A002060 2005-10-28
IT002060A ITMI20052060A1 (it) 2005-10-28 2005-10-28 Apparecchiatura per la movimentazione di organi chirurgici

Publications (1)

Publication Number Publication Date
WO2007049323A1 true WO2007049323A1 (fr) 2007-05-03

Family

ID=37745588

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IT2006/000758 Ceased WO2007049323A1 (fr) 2005-10-28 2006-10-27 Appareil servant a deplacer des instruments chirurgicaux

Country Status (3)

Country Link
EP (1) EP1951141A1 (fr)
IT (1) ITMI20052060A1 (fr)
WO (1) WO2007049323A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5488674A (en) * 1992-05-15 1996-01-30 David Sarnoff Research Center, Inc. Method for fusing images and apparatus therefor
US5676673A (en) * 1994-09-15 1997-10-14 Visualization Technology, Inc. Position tracking and imaging system with error detection for use in medical applications
DE10045779A1 (de) Assisting medical robot
US20030114743A1 (en) * 2001-12-19 2003-06-19 Kai Eck Method of improving the resolution of a medical nuclear image
US20030128801A1 (en) * 2002-01-07 2003-07-10 Multi-Dimensional Imaging, Inc. Multi-modality apparatus for dynamic anatomical, physiological and molecular imaging
US20040243147A1 (en) * 2001-07-03 2004-12-02 Lipow Kenneth I. Surgical robot and robotic controller

Also Published As

Publication number Publication date
ITMI20052060A1 (it) 2007-04-29
EP1951141A1 (fr) 2008-08-06

Similar Documents

Publication Publication Date Title
JP5417609B2 (ja) Medical image diagnostic apparatus
EP2372660A2 (fr) Apparatus and method for generating projection images, and computer-readable recording medium on which the corresponding program is recorded
JP6950801B2 (ja) Diagnostic imaging system
US8600138B2 (en) Method for processing radiological images to determine a 3D position of a needle
EP0406352A1 (fr) Method and apparatus for guiding surgical instruments, used notably in neurosurgery
EP2934326A2 (fr) Three-dimensional (3D) display and mapping system for diagnostic ultrasound machines
CN101422378B (zh) Ultrasonic diagnostic apparatus
CN103764038A (zh) X-ray CT apparatus, image display apparatus and image display method
US20130223703A1 (en) Medical image processing apparatus
CA2568442A1 (fr) Improved system and method for the ablation of tumours
US10922812B2 (en) Image processing apparatus, x-ray diagnostic apparatus, and image processing method
CN103732149B (zh) X-ray CT apparatus
CN103284749B (zh) Medical image processing apparatus
US8625873B2 (en) Medical image processing apparatus
JP2000051207A (ja) Medical image processing apparatus
JP6959612B2 (ja) Diagnostic imaging system
EP3659512A1 (fr) System and method for remote viewing of medical images
WO2007049323A1 (fr) Appareil servant a deplacer des instruments chirurgicaux
JP6953974B2 (ja) Diagnostic imaging system
RU2816071C1 (ru) Combined intraoperative navigation system using ultrasound image generation by the ray-tracing method
JP2006223333A (ja) Diagnostic imaging apparatus
US20240268777A1 (en) Medical image capturing apparatus and control method of the same
WO2025022416A1 (fr) Medical imaging analysis system for robotic surgery

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2006821748

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2006821748

Country of ref document: EP