WO2018007091A1 - Imaging device in an operating room - Google Patents
Imaging device in an operating room
- Publication number
- WO2018007091A1 (PCT/EP2017/063981)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot arm
- patient
- microscope
- image
- holder
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/44—Constructional features of apparatus for radiation diagnosis
- A61B6/4429—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
- A61B6/4458—Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units the source unit or the detector unit being attached to robotic arms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
Definitions
- the present invention relates to an apparatus for imaging in an operating room.
- Imaging used during an operation increases this efficiency, as it helps to avoid revision operations.
- However, the use of imaging systems prolongs and interrupts the operation, at best only for a few minutes.
- The disruption time increases dramatically if the imaging systems scan slowly, are awkward to use, or are difficult to prepare for use.
- Computed tomography (CT) systems consist of a closed gantry housing the X-ray source and the X-ray detector, installed in an operating room. A patient to be examined is positioned in the opening of the housing, and X-ray image data are acquired over a rotation range of over 360° by rotating the X-ray source and the X-ray detector.
- CT systems offer very high image quality, but due to their design they also require a large amount of space and block access to the patient during image acquisition.
- C-arms are more flexible because the X-ray source and the X-ray detector are arranged opposite one another on the C-arm itself. The C-arm is positioned at the patient for recording and rotated around the patient over an angular range of up to 200°. However, the increased flexibility comes at the price of a smaller scan area.
- The present invention is therefore based on the object of proposing a device which avoids the disadvantages mentioned, thus minimizing the interruption time of operations caused by imaging systems while at the same time obtaining information that is as accurate as possible.
- An apparatus for imaging in an operating room has an X-ray source, an X-ray detector, a robot arm, which may also be referred to as a manipulator, and a control unit for driving the robot arm.
- the X-ray source and the X-ray detector are arranged on a holder of the robot arm and can be moved on the holder or in the holder.
- the device can be flexibly and quickly deployed in the operating room.
- the holder simultaneously ensures a wide scan area and, associated therewith, a high image quality.
- Within the scope of this document, X-ray radiation is to be understood as electromagnetic radiation in the wavelength range between 5 pm and 250 pm. Integration into the operating room and needs-based availability keep the restriction of a surgeon's access to the patient to a minimum. Existing equipment of the operating room can still be used.
- A high degree of flexibility in the positioning reduces the cone-beam and metal artefacts typical of 3D C-arms in a 3D reconstruction and increases the reconstruction volume as needed.
- the achievable image quality with the same radiation dose is higher.
- The holder may be formed with a closed housing made of a material permeable to X-rays.
- The material thus lets the X-radiation pass while simultaneously protecting the X-ray source and detector from mechanical damage and contamination.
- In the context of this document, permeable is to be understood in particular to mean that at least 90 percent, preferably at least 95 percent, of an incident intensity of the X-ray radiation penetrates the housing.
- the housing is annular, so that a patient to be examined can be stored centrally in the housing and recordings can be taken from all sides.
- The robot arm is designed as an articulated arm with six degrees of freedom, i.e. three translational degrees of freedom and three rotational degrees of freedom, to ensure maximum flexibility.
- the robot arm usually has six axes, so it is designed as a six-axis robot.
- The robot arm preferably has a force-torque sensor with which forces and torques can be detected. After registering corresponding forces or torques and processing them in the control unit, the device can then be controlled in response to the detected signal.
- A display and input unit may be arranged to display information and/or accept inputs and forward them to the control unit.
- the information can also be provided by the control unit and include, for example, evaluated recordings.
- the control unit can therefore also be designed as a control and evaluation unit.
- a further aspect of the invention which, however, can also be implemented independently of the features discussed so far, relates to an optical recording unit. If this optical recording unit is used together with the device already described above, it is typically arranged on the robot arm, but particularly preferably on the holder or the housing. It can also be integrated in the housing.
- the optical recording unit has a microscope in order to obtain images with the highest possible resolution.
- the control unit may be designed or set up to display additional information on the display and input unit in addition to a recording recorded by the optical recording unit in order to provide as much information as possible to a user.
- the further information may in this case comprise a slice image from a data record taken by the X-ray detector, an overlay of a predefined target area and / or a structure of a three-dimensional data set.
- the device has at least one marker which identifies one of the prominent points.
- the at least one marker is provided on its surface with a pattern of an outer frame formed of black dots, an inner frame formed of white dots and a dot pattern of black and / or white dots.
- the marker may have a symmetrical pattern to simplify the determination of a center point.
- The described device can be used within the framework of an imaging method, or a method designed in accordance with the previously discussed features can be carried out with the disclosed device.
- A computer program product has a sequence of instructions stored on a machine-readable carrier, preferably a digital storage medium, for carrying out the described method and/or for driving the described device.
- The computer program product can be loaded directly into an internal memory of the electronic processing unit, or is already stored therein, and typically comprises parts of a program code for carrying out the described method or for driving the described device when the computer program product runs on the electronic processing unit.
- the computer program product can also comprise a computer program which has software means for carrying out the described method and / or for driving the described device when the computer program is executed in an automation system or on the control unit.
- Fig. 1 is a perspective view of the imaging device in an operating room
- Figure 2 is a view corresponding to Figure 1 of the device with X-ray source and detector;
- FIG. 3 is a view corresponding to Figure 2 with a display and output unit;
- FIG. 4 is a perspective view of the display and output unit with different types of activation
- FIG. 5 is a view corresponding to Figure 2 with an instrument holder;
- FIG. 6 is a side view of the microscope above the patient;
- Fig. 7 is a representation, corresponding to Figure 6, of the prior art;
- FIG. 8 is a schematic view of the recognition of prominent points or structures;
- FIG. 9 shows a view corresponding to FIG. 8 of the determination of prominent points or structures in three-dimensional space;
- FIG. 10 shows a schematic representation of the determination of a camera position
- Fig. 11 is a schematic view of the assignment of points and structures;
- FIG. 12 shows the preparation of additional information and its insertion into the microscope image;
- FIG. 13 shows a schematic representation of a determination of prominent points in three-dimensional space by recording a plurality of video images;
- FIG. 14 shows a representation corresponding to FIG. 13 with a stereo image pair received
- FIG. 15 shows a representation corresponding to FIG. 13 with the recording of an instrument
- FIG. 16 shows a representation corresponding to FIG. 13 with recording of an instrument in stereo images
- FIG. 17 is a schematic perspective view of a marker detection
- Fig. 18 is a plan view of various marker structures
- FIG. 19 is a perspective view of the marker structures in a lung biopsy;
- Fig. 20 is a plan view of a patient's head with applied markers
- Fig. 21 is a schematic view of an ultrasound application with markers
- FIG. 22 is a view corresponding to Figure 21 of a microscope application.
- FIG. 1 shows a perspective view of an apparatus for imaging in an operating room.
- the innovative approach presented here combines the concepts of computer tomography, cone beam tomography and robotics.
- A robot arm 1, which is configured as a six-axis robot arm, is controlled by a control unit 15 and can be moved to different positions. With one end, the robot arm 1 is fixed to a floor of the operating room.
- At its other end, a holder 2 is mounted, which consists of an annular housing, closed on all sides except for a cable bushing to the robot arm 1, formed of a material permeable to X-rays.
- The device comprises, in the embodiment shown in Figure 1, a base 3 for a patient table 6; these two components can, however, also be omitted in further embodiments.
- an interruption duration of an operation can be reduced to less than two minutes.
- All imaging components, i.e. an X-ray source and an X-ray detector, are arranged on an annular structure with a diameter of 1.5 m within the closed housing of the holder 2 and are rotatably mounted there.
- An annular bearing, in particular a ball bearing, allows this rotation; an electromagnetic drive unit moves one or both imaging components, for example via a sprocket on the moving side of the bearing.
- It is also possible to install a plurality of X-ray sources and a plurality of X-ray detectors.
- More than one X-ray source/X-ray detector combination may thus be used.
- The second combination of imaging components is then guided on the annular structure at an offset.
- the X-ray source has a diaphragm system that can restrict the beam area.
- the X-ray detector may be formed as a flat detector or a line or matrix detector.
- the aperture of the X-ray source can be changed by an electric motor such that the flat detector acts as a line detector.
- further positionable mechanical rings or rails may be present.
- (Robotic) systems for biopsies, laser systems for therapy or similar can be attached to these. Due to the known local relative position between the imaging and therapy system, these can ideally be used, for example, for automated or semi-automated surgical interventions.
- the robot arm 1 described can be mounted both on the floor (as shown) and on a wall or a ceiling of the operating room or examination area.
- a combination of the robotic arm movement with the movement of the annular structure allows known or new imaging paths.
- A larger image area can thus be recorded than the rigid combination of X-ray source and X-ray detector would allow without movement.
- the robot arm 1 can selectively move the housing or the holder 2 during the image acquisition, so as to improve the image quality and to scan deformed objects.
- the housing is provided with sensors that detect a position of the object and avoid a collision.
- the sensors can be designed, for example, electromagnetically, capacitively, inductively, optically or based on ultrasound. Sensors or detection units or markers can also be integrated or attached to potential collision objects.
- At least one force-torque sensor on the robot arm 1, in particular between an end effector and the housing (i.e. at the end of the robot arm 1) and/or at each joint of the robot arm 1, enables a reaction to acting forces or torques.
- The sensors increase the safety of the device, but can also be used specifically for interaction.
- Two-dimensional X-ray images can also be taken with the device described. At the same time, the system can very quickly change its image pickup direction without danger and thus take pictures from different directions.
- FIG. 2 shows an embodiment of an arrangement of the imaging components for cone-beam computed tomography. Recurring features are provided with identical reference numerals in this figure as well as in the following figures.
- The X-ray source 4 and the X-ray detector 5 are now arranged opposite each other and are rotated in this orientation, so that they record the patient from arbitrary lateral viewing angles.
- other imaging components of optical tomography such as lasers and cameras, in particular 3D cameras, can also be used.
- An optical sensor or recording unit may be arranged rigidly connected to the X-ray source 4 or the X-ray detector 5; for this purpose, an inner side of the housing is permeable to electromagnetic radiation in the visible range, i.e. at wavelengths between 400 nm and 780 nm.
- The housing may likewise be permeable to electromagnetic radiation in the infrared wavelength range, that is to say for wavelengths between 780 nm and 3 μm.
- The imaging robot arm 1 is designed to be collaborative, i.e. its mechanism is sensitive to forces or contactless inputs during the movement.
- the device can be moved and aligned along the patient table 6, for example, by hand.
- An implementation of the hand guide can be combined with a display of the device.
- the housing of the holder 2 can be provided with a planar display and output unit 7 as a visualization unit or enclosed by it.
- Since the display and output unit 7 can take over both a display function and a registration of touches, controls or scan results can also be displayed on it.
- Figure 4 shows a perspective view of the display and output unit 7 with different types of control.
- By force-sensor-assisted multi-touch input or contactless hand gestures, the unit used for display can simultaneously be used for a comfortable alignment of the device.
- In this way, the position and orientation of the housing are adjusted. For example, pulling on a housing edge 8 can cause a rotation of the housing about its longitudinal axis, or a yaw.
- Touching housing surfaces, for example the housing base surface 9 or the housing outer surface 10, causes a horizontal or vertical movement, as shown in FIG. 4.
- A dead-man switch or footswitch can confirm or unlock the input.
- For imaging, the described device is moved to the patient, and lateral and anterior-posterior images are taken automatically.
- The inner, annular structure with the imaging components moves, for example, into the vertical beam direction, makes an image capture there, then moves into the horizontal beam direction and makes another image capture there.
- the process can be done very quickly and with reduced risk of collision, since the movement takes place only within the annular housing.
- A scan center can be marked on both fluoroscopic images.
- the device corrects its orientation accordingly and then automatically performs the three-dimensional scan.
- the housing can also accommodate other modules.
- A holding device base 11 on a housing cover 12 or a housing bottom 13 can, analogously to the imaging components (i.e. the X-ray source 4 and the X-ray detector 5) but independently of them, be moved around the patient table 6 and aligned.
- Further external measuring systems or actuators can be mounted on this holding device base 11, with the holding device base 11 typically being designed as an articulated arm which has a holding device 14 at an end facing away from the housing.
- A robotic assistance system for holding and guiding surgical instruments can be attached, which can interact with the holding device base 11 such that its tool center point (TCP) remains fixed in space during movement of the holding device base 11.
- the surgical instrument remains stationary while the robot arm 1 can be adjusted or reoriented as needed.
- an optical pickup unit 16 can also be arranged on or in the holder 2. In further exemplary embodiments, however, this optical recording unit 16 can also be positioned detached from the holder 2.
- The fixed connection allows direct navigation in the recordings taken by the optical recording unit 16.
- The registration of the patient to the image data set that is customary with external navigation systems is therefore eliminated, which simplifies the handling of clinical navigation and opens up new possibilities for automation.
- An articulated arm or robot arm attached to the annular housing may align a guide of a biopsy needle with respect to the patient. In a first step, a 3D X-ray scan would be taken of the patient, and the tissue to be biopsied would be marked by the operator on the control panel of the display and output unit 7. The articulated arm can then align the guide of the biopsy needle so that the inserted needle removes exactly the marked tissue.
- the described device can be used both in a medical
- The optical pickup unit 16 already described may be attached to the housing as shown in FIG. 5, but in another embodiment it may also be used detached from the housing, and without the previously described device, together with other X-ray imaging systems. Therefore, a method for navigating a digital surgical microscope on the basis of video image data is described below, with which additional information can be superimposed by means of augmented reality in the video image of the surgical microscope, or surgical instruments that move in the field of view of the microscope can be navigated.
- Additional information in this sense can be planning data for the operation or slice images from a preoperatively recorded three-dimensional patient data set.
- the preoperatively recorded three-dimensional patient data set can result from X-ray computed tomography, magnetic resonance tomography or another three-dimensional imaging method.
- Planning data are markers in the patient record that the physician or another user places in the dataset prior to surgery to help differentiate structures or simplify finding them.
- the planning data includes a target area that defines the structures where the surgical intervention takes place or a safety area that defines structures that must under no circumstances be violated.
- Slice images displayed in the microscope image are generated from the patient data set and show the microscope's view of the patient at a deeper plane within the patient.
- the position determination, ie position and orientation, of the surgical microscope relative to the patient is effected directly by processing the video image data of the surgical microscope.
- Optical or electromagnetic tracking systems for determining the patient position and the microscope position are typically not used, but can of course be used in further embodiments.
- The method directly determines the transformation between the microscope camera and the patient, which eliminates additional calibration.
- Surgical microscopes are used in the operating room to enlarge structures in the operating area.
- the surgical intervention can be facilitated with the aid of the surgical microscope by inserting additional information into the image of the surgical microscope or by the surgical microscope or other auxiliary instruments being navigated in the image of the surgical microscope.
- Additional information from the patient data set includes, for example, slice images from a CT or magnetic resonance tomography data set.
- FIG. 6 shows this in a schematic side view.
- the optical pickup unit 16 with the microscope sensor 17 is positioned over the patient 19.
- The field of view 18 of the optical pickup unit 16 captures a surface contour; for the calculation, a coordinate origin 21 of the microscope coordinate system and a coordinate origin 20 of the three-dimensional patient data set are determined.
- the relative position can then be determined by a transformation between the two coordinate systems.
- In the prior art, shown in Fig. 7, a relative position between the microscope in the optical recording unit 16 and the patient 19 is detected by trackers 22 by means of an optical tracking system.
- The position of the respective tracker 22 on the microscope and the position of the tracker 22 on the patient 19 are detected by a camera 23; the position between the microscope tracker 22 and the microscope lens is determined by a hand-eye calibration, and that between the patient tracker 22 and the patient 19 by a patient registration.
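The hand-eye calibration mentioned here is conventionally posed as a fixed-transform problem; the following standard formulation is assumed for illustration and is not spelled out in this text:

```latex
A_i X = X B_i
```

where $A_i$ are relative motions of the microscope tracker 22 as measured by the camera 23, $B_i$ are the corresponding relative motions of the microscope lens, and $X$ is the unknown, constant transformation between tracker and lens that the calibration solves for.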
- The method presented here has the objective of assisting the microscope user by an augmented-reality display of additional information, such as a slice image from a CT data set, a display of a predefined target area, or a structure from the three-dimensional data set.
- the calculation of the transformation required between the microscope objective and the patient 19 is carried out by processing the image sequence of the microscope.
- FIG. 8 shows a schematic view of the recognition of prominent points or structures in the first step. Prominent points 25 or structures are reliably and repeatedly determined in an image sequence 26 of microscope images.
- FIG. 9 shows, in a view corresponding to FIG. 8, a determination of the prominent points 25 or structures in three-dimensional space.
- the recognized points or structures are reconstructed in three-dimensional space to provide a reference to the patient record.
- the determination of the camera position with respect to the points and structures in the three-dimensional space is shown in FIG.
- To determine the field of view of the microscope with respect to the 3D patient data set, the position of the camera with respect to the reconstructed points or structures in three-dimensional space is determined.
- An image 29 of a point in space 30 is generated through an optical center 31, which lies in the principal plane of the microscope objective 28.
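The imaging geometry just described, a space point mapped through an optical center onto the sensor, is the standard pinhole model. A minimal sketch in Python (numpy only; the function name and parameters are illustrative, not taken from the patent):

```python
import numpy as np

def project(K, R, t, X):
    """Pinhole projection of a world point X onto the sensor.
    K: 3x3 camera intrinsics, (R, t): world-to-camera pose."""
    Xc = R @ X + t          # point in camera coordinates
    u, v, w = K @ Xc        # homogeneous image coordinates
    return np.array([u / w, v / w])
```

Determining the unknown pose (R, t) from several such point-image pairs is exactly the camera-position problem illustrated in FIG. 10.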
- FIG. 11 shows in a schematic view how an assignment of the prominent points or structures to points or structures takes place in the 3D data set.
- the transformation 34 between the microscope and the 3D data set results from matching the prominent points 25 or structures in the three-dimensional space to the corresponding points or structures in the 3D data set.
- a registration algorithm calculates the relative transformation 34 (rotation and translation) by correspondences 32 between reconstructed points 33 in three-dimensional space and points 25 in the preoperatively recorded 3D data record.
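The registration step, recovering rotation and translation from point correspondences 32, can be implemented with the classic SVD-based (Kabsch) procedure. The sketch below is an illustrative stand-in for whatever registration algorithm the device actually uses:

```python
import numpy as np

def register_points(src, dst):
    """Estimate R, t with dst_i ~ R @ src_i + t (Kabsch algorithm).
    src, dst: (n, 3) arrays of corresponding 3D points."""
    c_src = src.mean(axis=0)
    c_dst = dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])                # guard against reflections
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t
```

In practice the matched pairs of reconstructed points 33 and data-set points 25 contain outliers, so a robust wrapper (e.g. RANSAC around this core) would typically be used.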
- FIG. 12 shows how the final step of the method, the preparation of additional information and its insertion into the microscope image, is done. Since the position of the field of view of the optical recording unit 16, or of the camera, relative to the preoperatively recorded 3D patient data set is known, additional information from the 3D data set can be displayed in the microscope image as support for the surgeon.
- The microscope image 37 is hereby provided with an overlay 36 of the structure which, for example, identifies the marked structure 35 as the target structure.
- FIG. 13 shows a first method in a schematic representation, with which the determination of the position of prominent points 25 or structures in three-dimensional space can take place.
- A reconstruction of the surface 39 of the patient 19 and a localization of the optical recording unit 16 are achieved by recording a plurality of video images from a plurality of positions.
- an image on the microscope sensor 17 is achieved by a microscope objective 38.
- the control unit 15 configured as a data processing unit then displays the microscope image 37 together with an augmented reality visualization 41 on a screen 40.
- FIG. 14 shows a variant of the reconstruction in which a stereo image pair is recorded.
- Two microscope sensors 17 for stereo viewing are now provided and correspondingly a microscope objective 38 with two optical paths for stereo viewing.
- FIG. 15 shows the determination of characteristic points in the microscope coordinate system by localization of a pointer instrument 42 in the microscope image.
- Feature points for patient registration are touched with the pointer 42 or another instrument.
- a localization of the pointer 42 or pointer instrument is carried out by image processing.
- A further variant is shown in FIG. 16, in a representation corresponding to FIGS. 13 to 15, in which the instrument 42 is localized on the basis of the stereo images.
- the reference between patient surface 39 or points on the patient surface 39 and the 3D data set of the patient 19 is implemented by a registration algorithm.
- the localization of the pointer instrument 42 is effected by a stereo localization of the stereo images.
- The transformation 34 between the microscope, including the microscope objective 38, and the 3D data set of the patient 19 is thus known.
- a further aspect of the invention which, however, can also be implemented independently of the features discussed so far, concerns an embodiment of a marker.
- the following describes a marker-based image processing approach to medical instrument and device navigation that does not require external tracking hardware.
- An optical recording unit 16 serving as a measuring camera, for example a small video camera, is located directly on the surgical instrument to be navigated.
- The location of the optical pickup unit 16 relative to the patient 19 is determined by processing the camera image data. Reliable tracking is guaranteed by artificial markers 44 glued or otherwise affixed to the patient 19. This makes the whole system not only simpler and faster to use, but also less expensive.
- Surgical navigation enables a live presentation of the position, i.e. the position and orientation in space, of surgical or medical instruments in a preoperatively recorded three-dimensional data set of a patient 19 (e.g. CT volume data) during surgery or diagnosis.
- By navigating medical instruments with 2D sensors, 3D data can be generated or data of several modalities can be combined.
- An example application is a 2D ultrasound probe whose navigated data can be combined into a 3D volume.
- Another example is the already described augmented-reality overlay (for example of defined target regions, patient structures or slice images from a 3D data set) into the image of navigated optical instruments. By navigating these instruments, e.g. an endoscope or microscope, the position of the optics relative to the patient 19, and thus the viewing direction of the optics with respect to the patient record, is known.
- the aim of the method is therefore a simple and robust position determination, which requires no complex system construction, including cumbersome registration processes.
- The approach is based on processing images that are taken directly by the optical pickup unit 16 mounted on the respective instrument or device. The processing then typically includes four steps: taking an image sequence, calculating corresponding feature points in the images, calculating a 3D model of the observed surface from these points, and calculating the camera position relative to this model.
- The feature points are detectable via the artificially placed markers 44 in each image. Special algorithms calculate the most distinctive properties of these points. Based on these properties, corresponding feature points can then be repeatedly detected in different images. By triangulating points found in several images, a 3D model of the observed surface is calculated from the feature points. Finally, the 3D points of this model can be used to calculate the respective position of the optical pickup unit 16 relative to the 3D model, i.e. the projection matrix that projects the 3D points to the correct position in the new image and thus describes the rotation and translation of the optical pickup unit 16. This situation is also shown in FIG. 17. Two optical recording units 16 detect the markers 44 arranged on the patient 19. A projection 45 of the markers 44 can then be determined in the image planes 43 of the optical recording units 16.
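The projection matrix described above, which maps model points to their correct position in a new image, can be estimated linearly from 3D-2D correspondences by direct linear transformation (DLT). This is a minimal sketch under the usual pinhole assumptions; a real system would refine the result nonlinearly and handle outliers:

```python
import numpy as np

def estimate_projection(X, x):
    """DLT estimate of the 3x4 projection matrix P with x ~ P @ [X; 1].
    X: (n, 3) world points, x: (n, 2) pixel coordinates, n >= 6,
    the points must not all be coplanar."""
    rows = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        p = np.array([Xw, Yw, Zw, 1.0])
        rows.append(np.concatenate([p, np.zeros(4), -u * p]))
        rows.append(np.concatenate([np.zeros(4), p, -v * p]))
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)   # null vector, up to scale
```

Decomposing the estimated P then yields the rotation and translation of the optical pickup unit 16 relative to the model.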
- Figure 18 shows in plan view several embodiments of structures of the markers 44, all of which have an outer black border, an inner white border, and an inner dot pattern. After the inner white edge, an additional black border can be provided again.
- the dot pattern as well as the edges may be constructed of square, circular, rectangular or elliptical pixels.
- the illustrated structures correspond to "finder patterns", which can be reliably detected by means of algorithms. In order to be able to reliably determine a center point, in the exemplary embodiments illustrated, each of the patterns is constructed symmetrically.
- The marker structures are based on the "finder patterns" of QR codes.
- The artificial feature points have two important properties: simple and robust detectability, and distinguishability.
- The simple detectability with image processing algorithms is given by the properties of the "finder patterns": their multiple nested contours can be found by means of contour recognition, and the artificial marker can thus be clearly detected.
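The nested-contour test can be sketched as follows. Given a contour hierarchy (one parent index per contour, as produced for instance by a contour tracer in tree mode), candidate finder-pattern centers are simply the contours nested sufficiently deep. This is an illustrative reduction, not the patent's exact algorithm:

```python
def finder_candidates(parents, min_depth=2):
    """Return indices of contours nested at least min_depth levels deep.
    parents[i] is the index of contour i's enclosing contour, -1 if none."""
    def depth(i):
        d = 0
        while parents[i] != -1:   # walk up to the outermost contour
            i = parents[i]
            d += 1
        return d
    return [i for i in range(len(parents)) if depth(i) >= min_depth]
```

For the markers of FIG. 18 (outer black border, inner white border, dot pattern) the marker interior appears as a contour with at least two ancestors; the dot pattern inside a candidate then disambiguates which individual marker was found.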
- the distinctiveness results from the differently defined structures in the middle of the respective marker 44.
- The peculiarity of the method is that the markers can be distinguished by means of conventional feature descriptors, since the gradients in the image around the center of each marker 44 differ by the defined structure. From each marker found, the midpoint is calculated. This point is used as a feature point, resulting in a sequence of four process steps: finder-pattern detection by contour recognition, midpoint calculation, descriptor calculation, and feature matching.
- both the selection of patterns, as well as the order and arrangement and, to a certain extent, the size of the placed points can be arbitrary, as long as a minimum number of markers 44 is always visible in the recorded images or recordings.
- FIG. 19 shows a perspective view of an application example in which an instrument 42, in the case illustrated a biopsy needle for performing a lung biopsy, has been provided with the optical recording unit 16.
- By means of the markers 44, the camera and biopsy-needle position can be determined at any time.
- Possibilities for applying the dots are, for example, pads with printed patterns or slightly sticky dots that can be attached to the patient 19 and easily removed after the procedure. The positions of these points should not change during the current application. These additional points can be used to implement a more robust position estimate, since it can be guaranteed that corresponding point correspondences can be determined in each recorded image.
- FIG. 20 shows a top view of a patient head 46 together with a medical instrument 42 (now an endoscope), the markers 44 and the optical recording unit 16 attached to the instrument 42.
- FIGS. 21 and 22 show schematic views of a navigated ultrasound application and of navigated microscopy, respectively.
- recordings made with the microscope sensor 17 allow direct conclusions to be drawn about the movement of the microscope sensor 17 itself.
- three-dimensional reconstructions or additional information can be superimposed onto the microscope images.
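Superimposing a 3D reconstruction onto a camera or microscope image amounts to projecting 3D points into the image plane once the pose is known. The patent does not give the projection model; the sketch below uses a standard pinhole camera, and the intrinsic matrix K and pose (R, t) are hypothetical values for illustration.

```python
import numpy as np

def project_points(X, K, R, t):
    """Project 3D world points into pixel coordinates with a pinhole camera.

    X: (N, 3) points in world coordinates.
    K: 3x3 intrinsic matrix; (R, t): world-to-camera pose.
    Returns (N, 2) pixel coordinates.
    """
    Xc = X @ R.T + t          # transform into the camera frame
    uvw = Xc @ K.T            # apply the intrinsics
    return uvw[:, :2] / uvw[:, 2:3]

# Hypothetical intrinsics: 800 px focal length, principal point (320, 240).
K = np.array([[800., 0, 320], [0, 800, 240], [0, 0, 1]])
R = np.eye(3)                 # camera aligned with the world frame
t = np.zeros(3)
pts = project_points(np.array([[0.0, 0.0, 2.0]]), K, R, t)
# a point on the optical axis lands on the principal point (320, 240)
```

The projected points give the pixel locations at which reconstruction geometry or annotations would be drawn into the live image.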
Abstract
The present invention relates to a device for imaging in an operating room, comprising an X-ray source (4), an X-ray detector (5), a robot arm (1) on which the X-ray source (4) and the X-ray detector (5) are arranged on a holder (2) located on the robot arm (1) and can be moved on or in said holder (2), and a control unit (15) for controlling the robot arm (1).
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102016112388.9 | 2016-07-06 | ||
| DE102016112374 | 2016-07-06 | ||
| DE102016112395.1 | 2016-07-06 | ||
| DE102016112388 | 2016-07-06 | ||
| DE102016112395 | 2016-07-06 | ||
| DE102016112374.9 | 2016-07-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018007091A1 (fr) | 2018-01-11 |
Family
ID=59091481
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2017/063981 (WO2018007091A1, ceased) | Imaging device in an operating room | 2016-07-06 | 2017-06-08 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2018007091A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2009036174A2 * | 2007-09-13 | 2009-03-19 | Henderson Toby D | Imaging positioning system having a robotically positioned D-arm |
| US20110280379A1 * | 2010-05-14 | 2011-11-17 | Michael Maschke | Imaging apparatus comprising a ring-shaped gantry |
| US20110280364A1 * | 2010-05-14 | 2011-11-17 | Michael Maschke | Medical examination device for CT imaging and for nuclear medical imaging |
| WO2013160303A2 * | 2012-04-25 | 2013-10-31 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | X-ray source having a module and a detector for optical radiation |
| DE102015212352A1 | 2015-07-01 | 2017-01-05 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method, arrangement and computer program product for detecting the position of an object to be examined |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11540887B2 (en) | 2020-06-05 | 2023-01-03 | Stryker European Operations Limited | Technique for providing user guidance in surgical navigation |
| US11571225B2 (en) | 2020-08-17 | 2023-02-07 | Russell Todd Nevins | System and method for location determination using movement between optical labels and a 3D spatial mapping camera |
| US12236536B2 (en) | 2020-08-17 | 2025-02-25 | Russell Todd Nevins | System and method for location determination using a mixed reality device and a 3D spatial mapping camera |
| US12290271B2 (en) | 2020-08-17 | 2025-05-06 | Russell Todd Nevins | System and method for location determination using movement between optical labels and a 3D spatial mapping camera |
| US11806081B2 (en) | 2021-04-02 | 2023-11-07 | Russell Todd Nevins | System and method for location determination using movement of an optical label fixed to a bone using a spatial mapping camera |
| US11871997B2 (en) | 2021-04-02 | 2024-01-16 | Russell Todd Nevins | System and method for location determination using movement of an optical label fixed to a bone using a spatial mapping camera |
| US11600053B1 (en) | 2021-10-04 | 2023-03-07 | Russell Todd Nevins | System and method for location determination using a mixed reality device and multiple imaging cameras |
| US11610378B1 (en) | 2021-10-04 | 2023-03-21 | Russell Todd Nevins | System and method for location determination using a mixed reality device and multiple imaging cameras |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP2840975B1 | | X-ray source having a module and a detector for optical radiation |
| EP3330922B1 | | Method and device for representing an object |
| DE69322202T2 | | System and method for enhancing endoscopic surgery |
| EP2082687B1 | | Superimposed display of recordings |
| DE69431875T2 | | Arrangement for determining the mutual position of bodies |
| EP1803399B1 | | Method and device for determining the current position of a structure of an object under examination in a coordinate system |
| WO2014068106A1 | | Imaging system, operating device comprising the imaging system, and imaging method |
| WO2012049038A1 | | Structured-light surgical navigation system |
| EP0682919A2 | | Method for correlating several coordinate systems in computer-assisted stereotactic surgery |
| WO2018007091A1 | | Imaging device in an operating room |
| DE9117261U1 | | Head-mounted localization system for a surgical probe |
| EP0799434A1 | | Microscope, in particular stereomicroscope, and method for superimposing two images |
| WO1994003100A1 | | Method for visualizing the interior of bodies |
| CH684291A5 | | Operating microscope for computer-assisted stereotactic microsurgery, and method for its operation |
| WO2011144412A1 | | Determining and verifying the coordinate transformation between an X-ray system and a surgical navigation system |
| EP3626176B1 | | Method for supporting a user, computer program product, data carrier and imaging system |
| WO2008058520A2 | | Device for generating images for an operator |
| WO2019149400A1 | | Method for planning the position of a recording system of a medical imaging apparatus, and medical imaging apparatus |
| DE102014210046A1 | | Operating microscope system |
| EP2111814B1 | | Method for registering a 2D image data set generated by fan-shaped imaging rays in the field of medicine, corresponding computer program product, and method and system for automatic registration of a body based on 2D image data for use with medical navigation systems |
| DE102008009266A1 | | Calibration of an instrument localization device with an imaging apparatus |
| DE102020200959A1 | | Recording a panoramic data set of an object under examination by means of a mobile medical X-ray device |
| DE102023101117A1 | | Method for medical navigation |
| DE102012200686A1 | | Method and device for positioning an X-ray device |
| DE102014223386A1 | | Robot registration in imaging modalities |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17731495; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17731495; Country of ref document: EP; Kind code of ref document: A1 |