EP2017756A1 - Method for displaying and/or processing or manipulating image data for medical purposes with gesture recognition - Google Patents
- Publication number
- EP2017756A1 (application EP07014276A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- inputs
- screen
- screen surface
- input
- image
- Prior art date
- 2007-07-20
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Z—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS, NOT OTHERWISE PROVIDED FOR
- G16Z99/00—Subject matter not provided for in other main groups of this subclass
Definitions
- The invention relates to a method for displaying and/or processing image data of medical or medical-technology origin.
- Medical image material produced by medical imaging techniques, such as computed tomography, magnetic resonance imaging or X-ray machines (two-dimensional/three-dimensional), is increasingly being stored as digital image material or as a digital image data record.
- Systems used for this purpose bear the name "Picture Archiving and Communication System" (PACS).
- The primary viewing and evaluation of such digital images is currently limited to radiologists, who work in designated viewing rooms with high-resolution, high-luminosity monitors; this allows hospitals to achieve savings on film material.
- A display system for medical images constructed in the manner of a showcase or lightbox is known from US 2002/0039084 A1.
- There, various options for manipulating the medical images are specified, such as inputs via a separate control panel, remote controls, simple touchscreen applications or voice control.
- The invention provides a digital lightbox with an optimized command input system that relies on the processing of user-executed gestures, which can be performed directly on the screen or detected by sensing means associated directly with the screen.
- Gestures, in this sense, are inputs that carry a special meaning by their nature or to which a specific meaning can be assigned by the display device or its components.
- Such gesture recognition, together with on-screen input via input-detection means associated with the screen, allows the user to view images quickly and intuitively and makes image viewing systems suitable for operating theaters, particularly because the necessary sterility can be maintained.
- Image viewing systems using the method according to the invention can be provided wall-mounted in the manner of showcases or lightboxes, thus offering the user his accustomed working environment. Complicated, expensive or difficult-to-sterilize operating means such as mice, keyboards or input keypads no longer need to be provided or operated, while the gesture recognition offers more diverse viewing and image manipulation options than conventional systems generally did.
- FIG. 1 shows a schematic representation of a digital lightbox with which the method according to the present invention can be implemented.
- The digital lightbox (display device) 1 comprises two separate screens or screen parts 2, 3 and an integrated computer data processing unit 4, which is shown only schematically.
- Via the computer data processing unit 4, image data records can be loaded into the lightbox 1 on the one hand; on the other hand, it controls the representation of the image data records in accordance with the input rules, which are discussed in more detail below with reference to many examples.
- The data processing unit 4 may optionally record changes or additions made by the inputs and alter the data records accordingly.
- The screens or screen parts 2, 3 are, in this example according to the invention, designed as so-called multi-touch screens.
- Multiple inputs can be detected simultaneously, such as inputs at various screen locations or areal inputs.
- The screen can capture inputs through actual touches of the screen surface or through presences near the screen surface, the latter being made possible, for example, by an infrared beam grid.
- The two screens or monitors 2 and 3 are arranged side by side; in a preferred embodiment, the smaller monitor 3 provides a control interface (for example, for data transfer, input command assignment or the selection of images or image data), while the images themselves are presented on the larger monitor.
- The width of the smaller monitor 3 corresponds to the height of the larger monitor 2, and the smaller monitor 3 is rotated by 90 degrees, which creates a large operating surface.
- FIG. 2 shows a screen section 15 on which an image 14 is displayed, here a schematic image of a patient's head. The hand 10 of an operator can be seen; on the left index finger, the area of the second phalanx is indicated as the areal region 13 and the tip of the index finger as the point 11.
- An operator can now perform, for example, an areal touch of the screen with the index finger region 13 (or with the entire finger); alternatively, the touch can be made only with the fingertip 11.
- Where the term "touch" is used, it should at least include the two above-mentioned types of input on the screen: an actual touch of the screen on the one hand, and the generation of a presence at the screen surface or at a (moderate) distance from the screen surface on the other.
- The operator can thus carry out different input gestures, which may comprise punctiform touches on the one hand and areal touches on the other. Such different inputs can also be interpreted differently, which gives the operator a further dimension for data entry.
- Examples of input interpretations that may be associated with areal or punctiform touches, and distinguished by the type of touch, are: moving images on the screen; selecting a position in a scrollbar or a similar user interface element; moving a scrollbar pointer to a selected position for faster selection in a scroll field; playing or pausing animated image sequences; or selecting options in a box with multiple (scrollable) options, such as changing the sort order.
- FIGS. 3a to 3d show possible uses of the present invention in image viewing.
- FIG. 3a shows, for example, how a selected image 14 can be influenced by a touch with one or two fingertips 11, 12 of one hand; one example of such an influence is the modification of brightness and contrast through gestures performed with the fingertips 11, 12.
- The brightness can be adjusted by touching the screen with a single fingertip and then performing a horizontal movement, while a vertical movement adjusts the contrast.
- Another example is spreading apart and bringing together the fingertips 11, 12, when the program is set to respond appropriately to such gestures.
- FIG. 3b shows how a certain screen detail can be selected with the help of two fingertips 11, 21 of two hands 10, 20, which is visually represented by the rectangular outline 23.
- The outline 23 can be generated, for example, by simultaneous contact with the two fingertips 11, 21.
- Corresponding command assignments can be stored in the gesture recognition software of the computer data processing unit 4, and such assignments can also be changed, for example by preselecting a particular interpretation on the small left-hand screen 3 of the lightbox 1. This basically applies to all other embodiments as well.
- A magnifying feature according to the present invention can be explained with reference to FIGS. 4a to 4c.
- The gesture recognition may include an assignment in which a first screen touch with the fingertip 21 enlarges an area in the vicinity of the touch point, which is then displayed in the manner of a magnifier with the edge 29.
- The text 27 is enlarged in this area, and it is possible (see FIG. 4c) to make a selection in the text with a second touch performed in parallel with, or subsequent (and then simultaneous) to, the first touch, for example the selection of a hyperlink if the second touch occurs within the enlarged area.
- With the second touch, however, another process can alternatively be triggered, for example the marking of an image location, which need not be a text element but can also be a specific part of an anatomical representation.
- A variant in which a polygonal outline is generated by means of a method according to the invention can be seen in FIGS. 5a to 5d.
- A series of touches triggers the selection or definition of a region of interest, for example a bone structure in a medical image 14.
- The first touch 31 is interpreted as the starting point of the region of interest or polygon; as long as the first point 31 remains active (which may, but need not, require a fingertip to remain on the first point), the following touches are interpreted as further points on the boundary line of the region of interest.
- Returning to the first point via further points 32, 33, etc. can indicate that the region is completely defined; this can also be done by another touch sequence or by removing all touches.
- The region of interest is then marked with the polygon 35 (FIG. 5d).
- Another image manipulation, namely the mirroring or tilting of an image 14 on the lightbox 1, is shown in FIGS. 6a to 6d.
- FIGS. 6a and 6b show how an image 14 can be tilted about a horizontal axis by displacing a virtual button 40, provided separately for this gesture, from the bottom to the top by means of a fingertip. If the displacement is in the horizontal direction, a corresponding tilt about a vertical axis can take place. After the tilting operation has been performed, the button remains at the displaced location 40', so that it can also serve as an indication that the image has been tilted or mirrored.
- A two-handed tilting or mirroring gesture is demonstrated in FIGS. 6c and 6d.
- An opposite movement of the two touches in the horizontal direction is interpreted in this embodiment as a command to flip the image 14 about a vertical axis.
- A mirroring about a horizontal axis is possible by a corresponding, opposite finger movement in the vertical direction.
- The input shown in FIGS. 7a and 7b concerns the invocation of an otherwise hidden menu field 45 by a first fingertip touch 11 (FIG. 7a), whereupon a selection in the unfolded menu can be made with a second touch, here for example the selection of the middle command field 46.
- The embodiment according to FIGS. 8a to 8c concerns the input of characters via an on-screen keyboard. More keystrokes can be made available than with a standard 101-key keyboard; for example, the input of all 191 characters according to ISO 8859-1 can be supported by assigning multiple characters to one virtual key. The characters are assigned using similarity criteria; for example, the character E is associated with several E characters bearing different accents. After the character E is selected on the keyboard portion 52, an additional keyboard portion 54 offers various alternative characters (FIG. 8b), while the character E in its basic form is already written in the control output 50. If, as in FIG. 8c, an accented special character E is then selected from the row 54, the last character entered is replaced by this special character.
- Scrollbar operation using the present invention is explained with reference to FIGS. 9a to 9d.
- The scrollbar 61 comprises a scroll arrow or scroll area 62.
- In FIG. 9d, the list 60 is additionally extended by a number column 63. According to the invention, scrolling through the list 60 can be performed by touching the scrollbar 61 in the arrow area 62; when the fingertip 21 is drawn downward while touching the screen, the list 60 is scrolled down, as can be seen in FIGS. 9a and 9b.
- By touching a position on the scrollbar directly, the list jumps to the corresponding relative position and the selected area is displayed.
- The display or sort order may also be changed by an input other than a punctiform selection.
- An areal contact with the second phalanx 23 results in a second list 63 being opened, which can be scrolled by moving the finger up and down.
- A variant of the invention in which diagrams are manipulated is shown in FIGS. 10a to 10c.
- The diagram 70, here an ECG of a patient, has, for example, the peak 72 (FIG. 10a). If a user wants to know more about the value at this point, he can, as in FIG. 10b, select the point 72 by circling it with his fingertip 21, whereupon, for example, a selection circle 74 appears as confirmation. In response to this selection, the computer data processing unit can output on the diagram axes 74, 76 the values relating to the peak 72, here 0.5 on the axis 74 and 54 on the axis 76. Similar evaluations are possible for other measurements or, for example, for properties such as color values of a selected point or a selected area.
- FIGS. 11a and 11b show two different ways of selecting a diagram area.
- The diagram area is selected by two fingertip touches 11, 21 on the bottom axis, and the height of the selection area 77 is automatically determined so as to include the important diagram portions.
- A selection in which the height itself is chosen can be seen for the area 78 in FIG. 11d, where the fingertip contacts 11 and 21 define opposite corners of the rectangular area 78.
- Already existing selections, such as the selection area 79 (FIG. 11e), can be changed by moving the fingertip 11, here into the area 79'.
- FIG. 12 shows how, using the present invention, a lightbox or its data processing unit can be informed whether the user is right- or left-handed.
- Placing the hand 20 flat on a screen area 17 causes multiple touches, and by detecting the sizes of the different contact points and the distances between the touches it can be determined, for example by comparison with a model, whether it is a right or a left hand.
- The user interface or the display can then be adjusted accordingly so that it is convenient and optimally manageable for the respective type of user.
- In one embodiment, the data processing unit can recognize that such a determination is intended, for example, when the hand rests on the screen for a certain period of time.
- FIGS. 13a to 13c show examples of line generation and manipulation.
- The user can bring two fingertips 21, 22 into contact with the screen, and a line 80 is drawn by this gesture. If the user then, as in FIG. 13b, moves the fingertips 21, 22 further apart, the line, which is defined at a right angle to the line connecting the fingertips, is stretched; the line length is thus defined relative to the fingertip distance.
- Similarly, a ruler 82 can be generated whose scale depends on the distance between the fingertips 21, 22.
- In general, the meaning of an input gesture may depend on an input mode that is selected beforehand or that results from, and can be identified from, the gesture itself.
- Two- and three-dimensional image manipulations that can be carried out with the aid of the present invention are demonstrated by way of example in FIGS. 14a to 14h.
- An object can be manipulated that is displayed on the screen as a three-dimensional model or three-dimensional reconstruction of a patient scan.
- FIG. 14a shows how a cutting plane through a brain 84 can be determined and displayed.
- The cutting plane 88 is a plane pointed to by the arrow 85.
- The arrow 85 is generated by two fingertip contacts 21, 22, and its length depends on the distance between the fingertips 21, 22. It runs perpendicular to the plane 88.
- If the fingertips 21, 22 are moved further apart or brought closer together, the position of the cutting plane 88 changes, and a corresponding sectional image can additionally be displayed alongside, as indicated by the reference numeral 86.
- In this way, the representation 86 can be "scrolled" as an orthogonal sectional plane through different cutting planes.
- FIGS. 14b and 14c show how moving two contacts in a rotary motion rotates a three-dimensional object about an axis that is parallel to the line of sight and centered on the line between the two contacts.
- FIG. 14f shows how two two-finger lines 87, 87' can be used to create cutting planes in a manner similar to FIG. 14a, whereby a three-dimensional wedge of the object can be defined.
- FIGS. 14g and 14h show, finally, that the rotation processes described can also be applied to two-dimensional representations that originate from a three-dimensional data set or have been assigned to one another in some other way.
- The representation 89 is turned by 90 degrees from the state of FIG. 14g to the state of FIG. 14h.
- Data set orientations can also be changed between sagittal, axial and coronal. For example, starting from a sagittal image, positioning the finger touches at the top of the image and pulling the contacts downwards would bring the image into an axial orientation.
- FIGS. 15a and 15b show another embodiment in which a GUI (graphical user interface) element 98 is first selected from a selection 97 by a fingertip touch in order to choose a label, after which a fingertip contact with the other hand 10 attaches the label 99 at the desired place.
- FIG. 16 shows how a delete confirmation for the image 100 can be requested and triggered according to the invention, namely by a two-handed touch of the buttons 104 and 106 following a prompt 102.
- FIG. 17 shows an application in which a real object can be measured, for example the pointing device 110, which is placed on the screen portion 19. If a corresponding mode is set, or if the object 110 remains on the screen for a longer period of time, a measurement can be triggered in which the contact area is measured or the number of contacts is counted, and corresponding object dimensions can be detected.
- FIGS. 18a and 18b show how a geometric object, here a circle, can be generated on the screen with the aid of corresponding gestures.
- In FIG. 18a, the circle 112 is created by placing one fingertip on the midpoint 114 and another fingertip on a perimeter point 116, while in FIG. 18b a circle 120 is entered through three perimeter points 122, 123 and 124.
- The implant 130 can be altered, for example by enlargement, reduction or rotation gestures, as described above. If other image data is available on the screen, e.g. anatomical structures into which the implant is to be inserted, the appropriate implant size can already be planned in advance on the screen. It is then also possible to let the computer compare the adapted implant with various stored and available implant sizes. If a suitable implant is already available and the database outputs a corresponding match, exactly this implant can be chosen or ordered, or the necessary adjustments of the next larger implant can already be calculated and output.
- A gesture can also be interpreted differently depending on the part of the image to which the gesture is applied.
- The displayed image has a bright head outline 134 and a dark area 132. When the fingertip is placed on the bright area, the gesture may be interpreted, for example, as a command to scroll through different cutting planes as the finger is drawn across the bright area (FIG. 20b), whereas if the fingertip 21 is placed on the dark area, the gesture is interpreted as a command to move the image, as shown in FIG. 20c.
- The gesture recognition can also be used to display and set a clock or a countdown, as shown in FIG. 21.
- A two-finger touch makes a countdown clock 140 appear on the monitor. When the index finger is then rotated around the thumb, this shifts the pointer 142, and the countdown can start at the preset time.
- Another application is a signature that is entered by multiple touches of the screen. If a line sequence is entered simultaneously or consecutively with two hands and corresponding gestures, this can be used for a very unambiguous identification of the user.
- FIGS. 23a to 23c concern a multiple selection of picture elements or image objects and the handling of such elements or objects.
- A number of image objects, namely smaller images 150, are displayed.
- The first image 152 and the last image 154 of an image sequence to be selected are selected in a corresponding selection mode.
- The first contact with the hand 10 on the image 152 remains active until the image 154 has also been selected.
- The plurality of selected images can now be subjected to different processing operations or used in different ways.
- One example is shown in FIG. 23b.
- A compressed file can be generated that includes all the selected individual representations and that is designated in FIG. 23b by the reference numeral 156.
- The application shown in FIG. 23c is, for example, the playback of a movie or of a sequence of the selected files, which can likewise be initiated by a corresponding gesture or by activating a play button.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Epidemiology (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Primary Health Care (AREA)
- Public Health (AREA)
- User Interface Of Digital Computer (AREA)
Description
The invention relates to a method for displaying and/or processing image data of medical or medical-technology origin.
Medical image material produced by medical imaging techniques, such as computed tomography, magnetic resonance imaging or X-ray machines (two-dimensional/three-dimensional), is increasingly being stored as digital image material or as a digital image data record. Systems used for this purpose bear the name "Picture Archiving and Communication System" (PACS). The primary viewing and evaluation of such digital images is currently limited to radiologists, who work in designated viewing rooms with high-resolution, high-luminosity monitors; this allows hospitals to achieve savings on film material.
Outside of radiology, the transition to filmless image viewing is progressing more slowly. For example, images that are viewed filmlessly in radiology are printed on film in order to be made available for secondary use in other departments. On the one hand, this may be because PACS computer programs are strongly geared to radiologists; on the other hand, their operation is often complicated. In addition, many physicians are used to working with a backlit showcase, also known as a "lightbox".
Efforts have been made to make digital image material more accessible for secondary use outside radiology, for example by means of large-screen monitors in operating theaters that can be operated via wireless keyboards or mice. Simple touchscreen controls are also used, or separate cameras are provided that can recognize control inputs from physicians or operating personnel. A display system for medical images constructed in the manner of a showcase or lightbox is known from US 2002/0039084 A1; there, various options for manipulating the medical images are specified, such as inputs via a separate control panel, remote controls, simple touchscreen applications or voice control.
It is the object of the present invention to make the viewing and manipulation of images or image data of medical or medical-technology origin, for example patient image data, simple and intuitive, and in particular to make them accessible in an optimized manner in operating theaters and for secondary use after radiology.
This object is achieved according to the invention by a method according to claim 1. The dependent claims define preferred embodiments of the invention.
In the method according to the invention for displaying and/or processing image data of medical or medical-technology origin, a display device with at least one screen is used, wherein
- the image data is processed by a computer data processing unit integrated in the display device to generate image output and / or to modify or confirm the image data;
- the image data are manipulated, generated or called up by inputs on the screen itself, and wherein
- the inputs are identified by means of the data processing unit via gesture recognition, the gestures being generated manually or with gesture-generating devices.
In other words, the invention provides a digital lightbox with an optimized command input system that relies on the processing of user-executed gestures, which can be performed directly on the screen or detected by sensing means associated directly with the screen. For the purposes of the present invention, such gestures are inputs that carry a special meaning by their nature or to which a specific meaning can be assigned by the display device or its components.
Such gesture recognition, together with on-screen input via input-detection means associated with the screen, allows the user to view images quickly and intuitively and makes image viewing systems suitable for operating theaters, particularly because the necessary sterility can be maintained. Image viewing systems using the method according to the invention can be provided wall-mounted in the manner of showcases or lightboxes, thus offering the user his accustomed working environment. Complicated, expensive or difficult-to-sterilize operating means such as mice, keyboards or input keypads no longer need to be provided or operated, while the gesture recognition additionally offers more diverse viewing and image manipulation options than conventional systems generally did.
The invention will now be explained in more detail with reference to several embodiments. It can comprise all of the features described herein, individually and in any expedient combination. In the accompanying drawings:
- FIG. 1: a schematic image of the digital lightbox according to the invention;
- FIG. 2: a representation of an areal input;
- FIGS. 3a to 3d: image viewing examples;
- FIGS. 4a to 4c: an example of a screen detail magnification;
- FIGS. 5a to 5d: an example of the generation of a polygon;
- FIGS. 6a to 6d: examples of mirroring or tilting an image;
- FIGS. 7a and 7b: examples of invoking a hidden menu;
- FIGS. 8a to 8c: examples of on-screen keyboard operation;
- FIGS. 9a to 9d: examples of a scroll operation;
- FIGS. 10a to 10c: an example of a point selection in a diagram;
- FIGS. 11a to 11f: examples of diagram manipulations;
- FIG. 12: an example of left-handed/right-handed recognition;
- FIGS. 13a to 13c: examples of line generation and manipulation;
- FIGS. 14a to 14h: examples of the manipulation of image representations for patient data sets;
- FIGS. 15a to 15d: examples of point assignments;
- FIG. 16: an example of a command confirmation;
- FIG. 17: an example of an object measurement;
- FIGS. 18a and 18b: examples of circular contour generation;
- FIG. 19: an example of the manipulation of an implant;
- FIGS. 20a to 20c: an example of an image-content-dependent input interpretation;
- FIG. 21: an example of a countdown setting;
- FIG. 22: an example of a signature input; and
- FIGS. 23a to 23c: examples of the manipulation of several picture elements.
FIG. 1 shows a schematic representation of a digital lightbox with which the method according to the present invention can be implemented. The digital lightbox (display device) 1 comprises two separate screens or screen parts 2, 3 and an integrated computer data processing unit 4, which is shown only schematically. Via the computer data processing unit 4, image data records can be loaded into the lightbox 1 on the one hand; on the other hand, it controls the representation of the image data records in accordance with the input rules, which are discussed in more detail below with reference to many examples. The data processing unit 4 may optionally also record changes or additions made by the inputs and alter the data records accordingly.
The integral installation of the data processing unit 4 results in a self-contained unit that, just like a lightbox, can be mounted on a wall. The two screens or monitors 2 and 3 are arranged side by side; in a preferred embodiment, the smaller monitor 3 provides a control interface (for example, for data transfer, input command assignment or the selection of images or image data), while the images themselves are presented on the larger monitor. The width of the smaller monitor 3 corresponds to the height of the larger monitor 2, and the smaller monitor 3 is rotated by 90 degrees, which creates a large operating surface.
The screens or screen parts 2, 3 are, in this example according to the invention, designed as so-called multi-touch screens, on which multiple inputs can be detected simultaneously, such as inputs at various screen locations or areal inputs. The screen can capture inputs through actual touches of the screen surface or through presences near the screen surface, the latter being made possible, for example, by an infrared beam grid. FIG. 2 shows a screen section 15 on which an image 14 is displayed, here a schematic image of a patient's head. The hand 10 of an operator can be seen; on the left index finger, the area of the second phalanx is indicated as the areal region 13 and the tip of the index finger as the point 11. An operator can now perform, for example, an areal touch of the screen with the index finger region 13 (or with the entire finger); alternatively, the touch can be made only with the fingertip 11.
Where the term "touch" is used here and below for an input on the screen, this term should at least include the two types of input already mentioned above: an actual touch of the screen on the one hand, and the generation of a presence at the screen surface or at a (moderate) distance from the screen surface on the other. As can be seen in FIG. 2, the operator can thus carry out different input gestures, comprising punctiform touches on the one hand and areal touches on the other; such different inputs can also be interpreted differently, which gives the operator a further dimension for data entry, for example for moving images, selecting positions in a scrollbar, playing or pausing animated image sequences, or selecting options in a scrollable option box.
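Purely as an editorial illustration, and not as part of the original disclosure, the distinction between punctiform and areal touches could be drawn from the reported contact area; the Contact structure, the threshold value and all names below are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    x: float      # contact centre in screen coordinates (px)
    y: float
    area: float   # estimated contact area (px^2)

# Assumed, device-dependent threshold: a fingertip covers far less
# area than the second phalanx of a finger.
AREAL_THRESHOLD = 400.0  # px^2

def classify_contact(c: Contact) -> str:
    """Distinguish a punctiform (fingertip) from an areal (phalanx) touch."""
    return "areal" if c.area >= AREAL_THRESHOLD else "punctiform"

print(classify_contact(Contact(120, 80, 90.0)))    # -> punctiform
print(classify_contact(Contact(130, 95, 1500.0)))  # -> areal
```

The two classes can then be routed to different command interpretations, which is the "further dimension" of input referred to above.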
FIGS. 3a to 3d show possible uses of the present invention in image viewing. FIG. 3a shows, for example, how a selected image 14 can be influenced by a touch with one or two fingertips 11, 12 of one hand; one example of such an influence is the modification of brightness and contrast through gestures performed with the fingertips 11, 12. The brightness can be adjusted by touching the screen with a single fingertip and then performing a horizontal movement, while a vertical movement adjusts the contrast. Another example is spreading apart and bringing together the fingertips 11, 12, when the program is set to respond appropriately to such gestures.
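Again as an illustrative sketch only (the function name and the sensitivity gain are assumptions), the described mapping of a one-finger drag onto brightness and contrast could look as follows:

```python
def adjust_window(brightness: float, contrast: float,
                  dx: float, dy: float) -> tuple[float, float]:
    """Map a one-finger drag to display settings: horizontal travel (dx)
    changes the brightness, vertical travel (dy) changes the contrast."""
    GAIN = 0.005  # assumed sensitivity per pixel of travel
    brightness = min(1.0, max(0.0, brightness + GAIN * dx))
    contrast = min(1.0, max(0.0, contrast + GAIN * dy))
    return brightness, contrast

# Dragging 100 px to the right brightens the image; dragging
# vertically would change the contrast instead.
print(adjust_window(0.5, 0.5, dx=100, dy=0))  # -> (1.0, 0.5)
```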
FIG. 3b shows how a certain screen detail can be selected with the help of two fingertips 11, 21 of two hands 10, 20, which is visually represented by the rectangular outline 23. The outline 23 can be generated, for example, by simultaneous contact with the two fingertips 11, 21. Corresponding command assignments can be stored in the gesture recognition software of the computer data processing unit 4, and such assignments can also be changed, for example by preselecting a particular interpretation on the small left-hand screen 3 of the lightbox 1; this basically applies to all other embodiments as well.
A magnifying feature according to the present invention can be explained with reference to FIGS. 4a to 4c. The gesture recognition may include an assignment in which a first screen touch with the fingertip 21 enlarges an area in the vicinity of the touch point, which is then displayed in the manner of a magnifier with the edge 29. The text 27 is enlarged in this area, and it is possible (see FIG. 4c) to make a selection in the text with a second touch performed in parallel with, or subsequent to, the first touch, for example the selection of a hyperlink if the second touch occurs within the enlarged area. With the second touch, however, another process can alternatively be triggered, for example the marking of an image location, which need not be a text element but can also be a specific part of an anatomical representation.
An embodiment variant in which a polygon is generated with the aid of a method according to the invention can be seen in FIGS. 5a to 5d. A series of touches triggers the selection or definition of a region of interest, for example a bone structure in a medical image 14. The first touch 31 is interpreted as the starting point of the region of interest or polygon; as long as the first point 31 remains active (which may, but need not, require a fingertip to remain on the first point), the following touches are interpreted as further points on the boundary line of the region of interest. Returning to the first point via further points 32, 33, etc. can indicate that the region is completely defined; this can also be done by another touch sequence or by removing all touches. The region of interest is then marked with the polygon 35 (FIG. 5d).
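A minimal sketch of the polygon definition just described, assuming a fixed pixel tolerance for "returning to the first point" (the tolerance and the function name are editorial assumptions, not part of the disclosure):

```python
import math

CLOSE_RADIUS = 20.0  # assumed tolerance (px) for returning to the first point

def build_polygon(touches: list[tuple[float, float]]) -> list[tuple[float, float]] | None:
    """Collect successive touches as polygon vertices; a touch landing back
    near the first point marks the region of interest as completely defined."""
    polygon: list[tuple[float, float]] = []
    for p in touches:
        if polygon and math.dist(p, polygon[0]) <= CLOSE_RADIUS:
            return polygon   # closed: region completely defined
        polygon.append(p)
    return None              # still open, waiting for further touches

# Four boundary touches followed by a return to the start close the polygon.
pts = [(100, 100), (200, 90), (220, 180), (120, 200), (102, 98)]
print(build_polygon(pts))  # -> the first four vertices
```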
A further image manipulation, namely the mirroring or tilting of an image 14 on the lightbox 1, is shown in FIGS. 6a to 6d. FIGS. 6a and 6b show how an image 14 can be tilted about a horizontal axis by displacing a virtual button 40, provided separately for this gesture, from the bottom to the top by means of a fingertip; if the displacement is in the horizontal direction, a corresponding tilt about a vertical axis can take place. After the tilting operation has been performed, the button remains at the displaced location 40', so that it also serves as an indication that the image has been tilted or mirrored.
A two-handed tilting or mirroring gesture is demonstrated in FIGS. 6c and 6d. An opposite movement of the two touches in the horizontal direction is interpreted in this embodiment as a command to flip the image 14 about a vertical axis; mirroring about a horizontal axis is possible by a corresponding, opposite finger movement in the vertical direction.
The input shown in FIGS. 7a and 7b concerns the invocation of an otherwise hidden menu field 45 by a first fingertip touch 11 (FIG. 7a), whereupon a selection in the unfolded menu can be made with a second touch, here for example the selection of the middle command field 46.
The embodiment according to FIGS. 8a to 8c concerns the input of characters via an on-screen keyboard. More keystrokes can be made available than with a standard 101-key keyboard; for example, the input of all 191 characters according to ISO 8859-1 can be supported by assigning multiple characters to one virtual key. The characters are assigned using similarity criteria; for example, the character E is associated with several E characters bearing different accents. After the character E is selected on the keyboard portion 52, an additional keyboard portion 54 offers various alternative characters (FIG. 8b), while the character E in its basic form is already written in the control output 50. If, as in FIG. 8c, an accented special character E is then selected from the row 54, the last character entered is replaced by this special character.
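As an illustrative sketch of the multi-character key assignment described above (the variant table and the function names are assumptions; the disclosure does not specify an implementation):

```python
# Assumed mapping from a base key to its ISO 8859-1 accented variants,
# grouped by the similarity criterion mentioned in the text.
VARIANTS = {"E": "ÈÉÊË", "A": "ÀÁÂÃÄÅ", "U": "ÙÚÛÜ"}

def type_key(text: str, key: str) -> tuple[str, str]:
    """Write the base character immediately (control output 50) and offer
    its accented variants (additional keyboard portion 54)."""
    return text + key, VARIANTS.get(key, "")

def pick_variant(text: str, variant: str) -> str:
    """Picking a variant replaces the last character entered."""
    return text[:-1] + variant

text, offered = type_key("R", "E")   # control output now shows "RE"
print(offered)                       # -> "ÈÉÊË"
print(pick_variant(text, "É"))       # -> "RÉ"
```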
A scrollbar operation or selection using the present invention is explained with reference to FIGS. 9a to 9d. The scrollbar 61 comprises a scroll arrow or scroll area 62; in FIG. 9d, the list 60 is additionally extended by a number column 63. According to the invention, scrolling through the list 60 can be performed by touching the scrollbar 61 in the arrow area 62; when the fingertip 21 is drawn downward while touching the screen, the list 60 is scrolled down, as can be seen in FIGS. 9a and 9b. By touching a position on the scrollbar directly, the list jumps to the corresponding relative position and the selected area is displayed. The display or sort order may also be changed by an input other than a punctiform selection: an areal contact with the second phalanx 23 results in a second list 63 being opened, which can be scrolled by moving the finger up and down.
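A sketch, for illustration only, of the relative jump produced by a punctiform touch on the scrollbar body; the coordinate handling and the names are assumptions:

```python
def jump_position(touch_y: float, bar_top: float, bar_height: float,
                  list_length: int) -> int:
    """A punctiform touch on the scrollbar jumps the list to the
    corresponding relative position instead of scrolling line by line."""
    rel = min(1.0, max(0.0, (touch_y - bar_top) / bar_height))
    return int(rel * (list_length - 1))

# Touching the bar three quarters of the way down a 200-entry list:
print(jump_position(touch_y=350, bar_top=50, bar_height=400, list_length=200))
# -> entry 149
```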
A variant of the invention in which diagrams are manipulated is shown in FIGS. 10a to 10c. The diagram 70, here an ECG of a patient, has, for example, the peak 72 (FIG. 10a). If a user wants to know more about the value at this point, he can, as in FIG. 10b, select the point 72 by circling it with his fingertip 21, whereupon, for example, a selection circle 74 appears as confirmation. In response to this selection, the computer data processing unit can output on the diagram axes 74, 76 the values relating to the peak 72, here 0.5 on the axis 74 and 54 on the axis 76. Similar evaluations are possible for other measurements or, for example, for properties such as color values of a selected point or a selected area.
The manipulation of diagrams is also possible as an embodiment of the invention. As shown in FIGS. 11a and 11b, a diagram area can be selected in two different ways: the area is selected by two fingertip touches 11, 21 on the bottom axis, the height of the selection area 77 being determined automatically so as to include the important diagram portions. A selection in which the height itself is chosen can be seen for the area 78 in FIG. 11d, where the fingertip contacts 11 and 21 define opposite corners of the rectangular area 78. Already existing selections, such as the selection area 79 (FIG. 11e), can be changed by moving the fingertip 11, here into the area 79'.
FIG. 12 shows how, using the present invention, a lightbox or its data processing unit can be informed whether the user is right- or left-handed. Placing the hand 20 flat on a screen area 17 causes multiple touches, and by detecting the sizes of the different contact points and the distances between the touches it can be determined, for example by comparison with a model, whether it is a right or a left hand. The user interface or the display can then be adjusted accordingly so that it is convenient and optimally manageable for the respective type of user. In one embodiment, the data processing unit can recognize that such a determination is intended, for example, when the hand rests on the screen for a certain period of time.
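The "comparison with a model" is not specified further in the disclosure; the following toy heuristic, whose names and geometric assumptions are entirely editorial, merely illustrates how contact geometry alone could separate a right hand from a left hand:

```python
import math

def centroid(pts: list[tuple[float, float]]) -> tuple[float, float]:
    return (sum(x for x, _ in pts) / len(pts), sum(y for _, y in pts) / len(pts))

def classify_hand(contacts: list[tuple[float, float]]) -> str:
    """Toy comparison for a flat, palm-down hand producing five contacts:
    the thumb is taken to be the contact farthest from the centroid of the
    other four, and its position along the finger row (left or right end)
    distinguishes a right from a left hand. A real system would also use
    the differing contact sizes mentioned in the text."""
    assert len(contacts) == 5
    thumb = max(contacts,
                key=lambda p: math.dist(p, centroid([q for q in contacts if q != p])))
    fingers = sorted(p for p in contacts if p != thumb)     # finger row, by x
    fx, fy = fingers[-1][0] - fingers[0][0], fingers[-1][1] - fingers[0][1]
    cx, cy = centroid(fingers)
    t_along = (thumb[0] - cx) * fx + (thumb[1] - cy) * fy   # signed projection
    return "right" if t_along < 0 else "left"

# Four fingertips in a row, thumb at the lower left -> right hand.
print(classify_hand([(100, 50), (140, 40), (180, 40), (220, 50), (60, 120)]))
```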
In principle, it is also possible to supplement the image material or the image data records with the aid of the method according to the invention, in particular to draw in objects or auxiliary lines. FIGS. 13a to 13c show examples of this: the user can bring two fingertips 21, 22 into contact with the screen, and a line 80 is drawn by this gesture. If the user then, as in FIG. 13b, moves the fingertips 21, 22 further apart, the line, which is defined at a right angle to the line connecting the fingertips, is stretched; the line length is thus defined relative to the fingertip distance. A ruler 82 can also be generated whose scale depends on the distance between the fingertips 21, 22.
This example also makes clear that, in the present invention, the meaning of an input gesture may quite generally depend on an input mode that is selected beforehand or that results from, and can be identified from, the gesture itself.
Two- and three-dimensional image manipulations that can be carried out with the aid of the present invention are demonstrated by way of example in FIGS. 14a to 14h. An object can be manipulated that is displayed on the screen as a three-dimensional model or three-dimensional reconstruction of a patient scan.
FIG. 14a shows how a cutting plane through a brain 84 can be determined and displayed. The cutting plane 88 is a plane pointed to by the arrow 85. The arrow 85 is generated by two fingertip contacts 21, 22, and its length depends on the distance between the fingertips 21, 22; it runs perpendicular to the plane 88. If the fingertips 21, 22 are moved further apart or brought closer together, the position of the cutting plane 88 changes, and a corresponding sectional image can additionally be displayed alongside, as indicated by the reference numeral 86. In this way, the representation 86 can be "scrolled" as an orthogonal sectional plane through different cutting planes.
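For illustration (assuming the two contacts have already been mapped into the coordinate system of the three-dimensional data set; all names are editorial), the derivation of the cutting plane from the arrow could be sketched as follows:

```python
import math

Point = tuple[float, float, float]

def cutting_plane(p1: Point, p2: Point) -> tuple[Point, float]:
    """Derive the cutting plane 88 from the two contacts defining arrow 85:
    the arrow direction serves as the plane normal and the plane is placed
    at the arrow tip, so spreading the fingertips apart shifts the plane."""
    v = tuple(b - a for a, b in zip(p1, p2))
    length = math.sqrt(sum(c * c for c in v))
    n = tuple(c / length for c in v)             # unit normal along the arrow
    d = sum(nc * pc for nc, pc in zip(n, p2))    # plane equation: n . x = d
    return n, d

# Contacts 30 mm apart along z place the plane at z = 40 mm.
print(cutting_plane((0.0, 0.0, 10.0), (0.0, 0.0, 40.0)))
# -> ((0.0, 0.0, 1.0), 40.0)
```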
FIGS. 14b and 14c show how moving two contacts in a rotary motion rotates a three-dimensional object about an axis that is parallel to the line of sight and centered on the line between the two contacts.
FIG. 14f shows how two two-finger lines 87, 87' can be used to create cutting planes in a manner similar to FIG. 14a, whereby a three-dimensional wedge of the object can be defined.
FIGS. 14g and 14h show, finally, that the rotation processes described can also be applied to two-dimensional representations that originate from a three-dimensional data set or have been assigned to one another in some other way. The representation 89 is turned by 90 degrees from the state of FIG. 14g to the state of FIG. 14h. Data set orientations can also be changed between sagittal, axial and coronal; for example, starting from a sagittal image, positioning the finger touches at the top of the image and pulling the contacts downwards would bring the image into an axial orientation.
A further field of application of the invention concerns so-called "pairing", i.e. the assignment of two or more object points. For example, during registration, fusion or matching of two different images, individual points from both images can be identified and assigned as the same object point. In this context, FIGS. 15a and 15b show another embodiment in which a GUI (graphical user interface) element 98 is first selected from a selection 97 by a fingertip touch in order to choose a label, after which a fingertip contact with the other hand 10 attaches the label 99 at the desired place.
Because information can be lost when images are accidentally deleted, an application designed according to the invention can offer more safety here as well. For example, FIG. 16 shows how a delete confirmation for the image 100 can be requested and triggered according to the invention, namely by a two-handed touch of the buttons 104 and 106 following a prompt 102.
FIG. 17 shows an application in which a real object can be measured, for example the pointing device 110, which is placed on the screen portion 19. If a corresponding mode is set, or if the object 110 remains on the screen for a longer period of time, a measurement can be triggered in which the contact area is measured or the number of contacts is counted, and corresponding object dimensions can be detected. FIGS. 18a and 18b further show how a geometric object, here a circle, can be generated on the screen with the aid of corresponding gestures: in FIG. 18a, the circle 112 is created by placing one fingertip on the midpoint 114 and another fingertip on a perimeter point 116, while in FIG. 18b a circle 120 is entered through three perimeter points 122, 123 and 124.
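The circle through three perimeter points is the classical circumcircle construction; the following sketch (an editorial illustration, not part of the disclosure) computes it directly:

```python
import math

def circle_from_three_points(p1, p2, p3):
    """Circle through three perimeter points: solve the standard
    circumcentre equations; returns (centre_x, centre_y, radius)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-9:
        raise ValueError("points are collinear; no unique circle")
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    r = math.hypot(x1 - ux, y1 - uy)
    return ux, uy, r

# Three touches on the unit circle reproduce centre (0, 0) and radius 1.
print(circle_from_three_points((1, 0), (0, 1), (-1, 0)))  # -> (0.0, 0.0, 1.0)
```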
Medical implants can also be manipulated in their on-screen representation, such as the implant 130 shown in FIG. 19. The implant 130 can be altered, for example by enlargement, reduction or rotation gestures, as described above. If other image data is available on the screen, e.g. anatomical structures into which the implant is to be inserted, the appropriate implant size can already be planned in advance on the screen. It is then also possible to let the computer compare the adapted implant with various stored and available implant sizes. If a suitable implant is already available and the database outputs a corresponding match, exactly this implant can be chosen or ordered, or the necessary adjustments of the next larger implant can already be calculated and output.
With reference to FIGS. 20a to 20c, it is illustrated that a gesture can also be interpreted differently depending on the part of the image to which the gesture is applied. The displayed image has a bright head outline 134 and a dark area 132. When the fingertip is placed on the bright area, the gesture may be interpreted, for example, as a command to scroll through different cutting planes as the finger is drawn across the bright area (FIG. 20b), whereas if the fingertip 21 is placed on the dark area, the gesture is interpreted as a command to move the image, as shown in FIG. 20c.
In operating theaters, certain times must sometimes be observed, in particular certain waiting times, for example when a material has to cure. In order to measure such waiting times, the gesture recognition can, in a suitable mode, be used to display and set a clock or a countdown; in FIG. 21, a two-finger touch makes a countdown clock 140 appear on the monitor. When the index finger is then rotated around the thumb, this shifts the pointer 142, and the countdown can start at the preset time.
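A purely illustrative sketch of how the rotation of the index finger around the thumb could be mapped onto a countdown preset; the zero position at twelve o'clock and the 60-minute scale are assumptions:

```python
import math

def pointer_minutes(thumb: tuple[float, float], index: tuple[float, float]) -> int:
    """Map the angle of the thumb->index vector, measured clockwise from
    twelve o'clock (screen y axis pointing down), onto a 0-59 minute preset
    for the countdown clock 140 / pointer 142."""
    dx, dy = index[0] - thumb[0], index[1] - thumb[1]
    angle = math.atan2(dx, -dy) % (2 * math.pi)
    return int(angle / (2 * math.pi) * 60)

# Index finger directly to the right of the thumb: a quarter turn.
print(pointer_minutes((100, 100), (140, 100)))  # -> 15
```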
Another application is a signature that is entered by multiple touches of the screen (FIG. 22). If a line sequence is entered simultaneously or consecutively with two hands and corresponding gestures, this can be used for a very unambiguous identification of the user.
The embodiment explained with reference to FIGS. 23a to 23c concerns a multiple selection of picture elements or image objects and the handling of such elements or objects. A number of image objects, namely smaller images 150, are displayed; in a corresponding selection mode, the first image 152 and the last image 154 of an image sequence to be selected are selected, the first contact with the hand 10 on the image 152 remaining active until the image 154 has also been selected. The plurality of selected images can now be subjected to different processing operations or used in different ways. For example, as shown in FIG. 23b, a compressed file can be generated that includes all the selected individual representations, designated there by the reference numeral 156. The application shown in FIG. 23c is, for example, the playback of a movie or of a sequence of the selected files, which can likewise be initiated by a corresponding gesture or by activating a play button.
Claims (31)
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP07014276A EP2017756A1 (en) | 2007-07-20 | 2007-07-20 | Method for displaying and/or processing or manipulating image data for medical purposes with gesture recognition |
| EP08151038A EP2031531A3 (en) | 2007-07-20 | 2008-02-04 | Integrated medical technical display system |
| US12/176,027 US20090021475A1 (en) | 2007-07-20 | 2008-07-18 | Method for displaying and/or processing image data of medical origin using gesture recognition |
| US12/176,107 US20090021476A1 (en) | 2007-07-20 | 2008-07-18 | Integrated medical display system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP07014276A EP2017756A1 (en) | 2007-07-20 | 2007-07-20 | Method for displaying and/or processing or manipulating image data for medical purposes with gesture recognition |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP2017756A1 true EP2017756A1 (en) | 2009-01-21 |
Family
ID=38477329
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP07014276A Ceased EP2017756A1 (en) | 2007-07-20 | 2007-07-20 | Method for displaying and/or processing or manipulating image data for medical purposes with gesture recognition |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US20090021475A1 (en) |
| EP (1) | EP2017756A1 (en) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2012025159A1 (en) | 2010-08-27 | 2012-03-01 | Brainlab Ag | Multiple-layer pointing position determination on a medical display |
| CN103153171A (en) * | 2010-08-30 | 2013-06-12 | 富士胶片株式会社 | Medical information display device, method and program |
| CN104049639A (en) * | 2014-06-24 | 2014-09-17 | 上海大学 | Unmanned surface vehicle anti-surge control device and method based on support vector regression |
| DE102014107966A1 (en) * | 2014-06-05 | 2015-12-17 | Atlas Elektronik Gmbh | Screen, sonar and watercraft |
| US9928570B2 (en) | 2014-10-01 | 2018-03-27 | Calgary Scientific Inc. | Method and apparatus for precision measurements on a touch screen |
| US20230293248A1 (en) * | 2016-09-27 | 2023-09-21 | Brainlab Ag | Efficient positioning of a mechatronic arm |
Families Citing this family (43)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101070943B1 (en) * | 2008-07-10 | 2011-10-06 | 삼성메디슨 주식회사 | Ultrasound system having virtual keyboard and method of controlling the same |
| DE102009004898A1 (en) * | 2009-01-16 | 2010-08-19 | Siemens Aktiengesellschaft | Method for displaying two different images of a fusion image and device therefor |
| WO2010148127A2 (en) | 2009-06-16 | 2010-12-23 | Medicomp Systems, Inc. | Caregiver interface for electronic medical records |
| US20110099476A1 (en) * | 2009-10-23 | 2011-04-28 | Microsoft Corporation | Decorating a display environment |
| US20110113329A1 (en) * | 2009-11-09 | 2011-05-12 | Michael Pusateri | Multi-touch sensing device for use with radiological workstations and associated methods of use |
| US9996971B2 (en) * | 2009-11-25 | 2018-06-12 | Carestream Health, Inc. | System providing companion images |
| JP5750875B2 (en) * | 2010-12-01 | 2015-07-22 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
| US8839150B2 (en) * | 2010-02-10 | 2014-09-16 | Apple Inc. | Graphical objects that respond to touch or motion input |
| US20130112202A1 (en) * | 2010-05-07 | 2013-05-09 | Petter Fogelbrink | User interface for breathing apparatus |
| US10042546B2 (en) * | 2011-01-07 | 2018-08-07 | Qualcomm Incorporated | Systems and methods to present multiple frames on a touch screen |
| US20130055139A1 (en) * | 2011-02-21 | 2013-02-28 | David A. Polivka | Touch interface for documentation of patient encounter |
| US8947429B2 (en) | 2011-04-12 | 2015-02-03 | Autodesk, Inc. | Gestures and tools for creating and editing solid models |
| US8902222B2 (en) | 2012-01-16 | 2014-12-02 | Autodesk, Inc. | Three dimensional contriver tool for modeling with multi-touch devices |
| US9182882B2 (en) | 2011-04-12 | 2015-11-10 | Autodesk, Inc. | Dynamic creation and modeling of solid models |
| US8860726B2 (en) | 2011-04-12 | 2014-10-14 | Autodesk, Inc. | Transform manipulator control |
| WO2012145011A1 (en) * | 2011-04-22 | 2012-10-26 | Hewlett-Packard Development Company, L.P. | Systems and methods for displaying data on large interactive devices |
| RU2611977C2 (en) | 2011-05-09 | 2017-03-01 | Конинклейке Филипс Н.В. | Rotating object on screen |
| JP5309187B2 (en) * | 2011-05-26 | 2013-10-09 | 富士フイルム株式会社 | MEDICAL INFORMATION DISPLAY DEVICE, ITS OPERATION METHOD, AND MEDICAL INFORMATION DISPLAY PROGRAM |
| US8860675B2 (en) | 2011-07-12 | 2014-10-14 | Autodesk, Inc. | Drawing aid system for multi-touch devices |
| US9251144B2 (en) * | 2011-10-19 | 2016-02-02 | Microsoft Technology Licensing, Llc | Translating language characters in media content |
| CA2794898C (en) | 2011-11-10 | 2019-10-29 | Victor Yang | Method of rendering and manipulating anatomical images on mobile computing device |
| WO2013109244A1 (en) * | 2012-01-16 | 2013-07-25 | Autodesk, Inc. | Three dimensional contriver tool for modeling with multi-touch devices |
| WO2013109245A1 (en) * | 2012-01-16 | 2013-07-25 | Autodesk, Inc. | Dynamic creation and modeling of solid models |
| WO2013109246A1 (en) * | 2012-01-16 | 2013-07-25 | Autodesk, Inc. | Gestures and tools for creating and editing solid models |
| US10503373B2 (en) * | 2012-03-14 | 2019-12-10 | Sony Interactive Entertainment LLC | Visual feedback for highlight-driven gesture user interfaces |
| JP5678913B2 (en) * | 2012-03-15 | 2015-03-04 | コニカミノルタ株式会社 | Information equipment and computer programs |
| US9134901B2 (en) * | 2012-03-26 | 2015-09-15 | International Business Machines Corporation | Data analysis using gestures |
| US9324188B1 (en) * | 2012-04-30 | 2016-04-26 | Dr Systems, Inc. | Manipulation of 3D medical objects |
| CN103777857A (en) * | 2012-10-24 | 2014-05-07 | 腾讯科技(深圳)有限公司 | Method and device for rotating video picture |
| US10226230B2 (en) * | 2013-06-10 | 2019-03-12 | B-K Medical Aps | Ultrasound imaging system image identification and display |
| KR102166330B1 (en) | 2013-08-23 | 2020-10-15 | 삼성메디슨 주식회사 | Method and apparatus for providing user interface of medical diagnostic apparatus |
| US9815087B2 (en) | 2013-12-12 | 2017-11-14 | Qualcomm Incorporated | Micromechanical ultrasonic transducers and display |
| DE102013226973B4 (en) * | 2013-12-20 | 2019-03-28 | Siemens Healthcare Gmbh | Method and device for simultaneously displaying a medical image and a graphic control element |
| WO2015127378A1 (en) | 2014-02-21 | 2015-08-27 | Medicomp Systems, Inc. | Intelligent prompting of protocols |
| KR20150120774A (en) | 2014-04-18 | 2015-10-28 | 삼성전자주식회사 | System and method for detecting region of interest |
| WO2015164402A1 (en) * | 2014-04-22 | 2015-10-29 | Surgerati, Llc | Intra-operative medical image viewing system and method |
| US11977998B2 (en) | 2014-05-15 | 2024-05-07 | Storz Endoskop Produktions Gmbh | Surgical workflow support system |
| JP2017533487A (en) | 2014-08-15 | 2017-11-09 | ザ・ユニバーシティ・オブ・ブリティッシュ・コロンビア | Method and system for performing medical procedures and accessing and / or manipulating medical related information |
| US11347316B2 (en) | 2015-01-28 | 2022-05-31 | Medtronic, Inc. | Systems and methods for mitigating gesture input error |
| US10613637B2 (en) * | 2015-01-28 | 2020-04-07 | Medtronic, Inc. | Systems and methods for mitigating gesture input error |
| US10600015B2 (en) | 2015-06-24 | 2020-03-24 | Karl Storz Se & Co. Kg | Context-aware user interface for integrated operating room |
| JP6784115B2 (en) * | 2016-09-23 | 2020-11-11 | コニカミノルタ株式会社 | Ultrasound diagnostic equipment and programs |
| EP3582707B1 (en) | 2017-02-17 | 2025-08-06 | NZ Technologies Inc. | Methods and systems for touchless control of surgical environment |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE19845030A1 (en) * | 1998-09-30 | 2000-04-20 | Siemens Ag | Imaging system for reproduction of medical image information |
| US6424332B1 (en) * | 1999-01-29 | 2002-07-23 | Hunter Innovations, Inc. | Image comparison apparatus and method |
| WO2006020305A2 (en) * | 2004-07-30 | 2006-02-23 | Apple Computer, Inc. | Gestures for touch sensitive input devices |
Family Cites Families (54)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4506354A (en) * | 1982-09-30 | 1985-03-19 | Position Orientation Systems, Ltd. | Ultrasonic position detecting system |
| US6400996B1 (en) * | 1999-02-01 | 2002-06-04 | Steven M. Hoffberg | Adaptive pattern recognition based control system and method |
| US5875108A (en) * | 1991-12-23 | 1999-02-23 | Hoffberg; Steven M. | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
| US5903454A (en) * | 1991-12-23 | 1999-05-11 | Hoffberg; Linda Irene | Human-factored interface corporating adaptive pattern recognition based controller apparatus |
| US6978166B2 (en) * | 1994-10-07 | 2005-12-20 | Saint Louis University | System for use in displaying images of a body part |
| US5797849A (en) * | 1995-03-28 | 1998-08-25 | Sonometrics Corporation | Method for carrying out a medical procedure using a three-dimensional tracking and imaging system |
| US6107997A (en) * | 1996-06-27 | 2000-08-22 | Ure; Michael J. | Touch-sensitive keyboard/mouse and computing device using the same |
| US5825308A (en) * | 1996-11-26 | 1998-10-20 | Immersion Human Interface Corporation | Force feedback interface having isotonic and isometric functionality |
| US5682886A (en) * | 1995-12-26 | 1997-11-04 | Musculographics Inc | Computer-assisted surgical system |
| US5988862A (en) * | 1996-04-24 | 1999-11-23 | Cyra Technologies, Inc. | Integrated system for quickly and accurately imaging and modeling three dimensional objects |
| JPH1063409A (en) * | 1996-08-26 | 1998-03-06 | Fuji Electric Co Ltd | Cancel operation method |
| US5810008A (en) * | 1996-12-03 | 1998-09-22 | Isg Technologies Inc. | Apparatus and method for visualizing ultrasonic images |
| US6720949B1 (en) * | 1997-08-22 | 2004-04-13 | Timothy R. Pryor | Man machine interfaces and applications |
| US20020036617A1 (en) * | 1998-08-21 | 2002-03-28 | Timothy R. Pryor | Novel man machine interfaces and applications |
| US6226548B1 (en) * | 1997-09-24 | 2001-05-01 | Surgical Navigation Technologies, Inc. | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
| US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
| US6175610B1 (en) * | 1998-02-11 | 2001-01-16 | Siemens Aktiengesellschaft | Medical technical system controlled by vision-detected operator activity |
| US6313853B1 (en) * | 1998-04-16 | 2001-11-06 | Nortel Networks Limited | Multi-service user interface |
| US6511426B1 (en) * | 1998-06-02 | 2003-01-28 | Acuson Corporation | Medical diagnostic ultrasound system and method for versatile processing |
| US6091378A (en) * | 1998-06-17 | 2000-07-18 | Eye Control Technologies, Inc. | Video processing methods and apparatus for gaze point tracking |
| US6424996B1 (en) * | 1998-11-25 | 2002-07-23 | Nexsys Electronics, Inc. | Medical network system and method for transfer of information |
| JP2003530131A (en) * | 1999-03-07 | 2003-10-14 | ディスクレ リミテッド | Surgical method and apparatus using computer |
| JP4067220B2 (en) * | 1999-03-25 | 2008-03-26 | 富士フイルム株式会社 | Quality control system for medical diagnostic equipment |
| SE0000850D0 (en) * | 2000-03-13 | 2000-03-13 | Pink Solution Ab | Recognition arrangement |
| US6574511B2 (en) * | 2000-04-21 | 2003-06-03 | Medtronic, Inc. | Passive data collection system from a fleet of medical instruments and implantable devices |
| US6931254B1 (en) * | 2000-08-21 | 2005-08-16 | Nortel Networks Limited | Personalized presentation system and method |
| US20020186818A1 (en) * | 2000-08-29 | 2002-12-12 | Osteonet, Inc. | System and method for building and manipulating a centralized measurement value database |
| JP4176299B2 (en) * | 2000-09-29 | 2008-11-05 | 富士フイルム株式会社 | Medical image display system |
| US7095401B2 (en) * | 2000-11-02 | 2006-08-22 | Siemens Corporate Research, Inc. | System and method for gesture interface |
| CA2368923C (en) * | 2001-01-25 | 2006-03-14 | Jsj Seating Company Texas, L.P. | Office chair |
| US7030861B1 (en) * | 2001-02-10 | 2006-04-18 | Wayne Carl Westerman | System and method for packing multi-touch gestures onto a hand |
| WO2003009269A2 (en) * | 2001-07-18 | 2003-01-30 | Daniel Dunn | Multiple flat panel display system |
| EP1550024A2 (en) * | 2002-06-21 | 2005-07-06 | Cedara Software Corp. | Computer assisted system and method for minimal invasive hip, uni knee and total knee replacement |
| US6857746B2 (en) * | 2002-07-01 | 2005-02-22 | Io2 Technology, Llc | Method and system for free-space imaging display and interface |
| US20040109608A1 (en) * | 2002-07-12 | 2004-06-10 | Love Patrick B. | Systems and methods for analyzing two-dimensional images |
| US8745541B2 (en) * | 2003-03-25 | 2014-06-03 | Microsoft Corporation | Architecture for controlling a computer using hand gestures |
| FI114831B (en) * | 2003-05-07 | 2004-12-31 | Tekla Corp | Computer-aided model design |
| DE10325382A1 (en) * | 2003-05-30 | 2004-12-23 | Karl Storz Gmbh & Co. Kg | Method and device for visualizing medical patient data on a medical display unit |
| US7768500B2 (en) * | 2003-06-16 | 2010-08-03 | Humanscale Corporation | Ergonomic pointing device |
| US20050267353A1 (en) * | 2004-02-04 | 2005-12-01 | Joel Marquart | Computer-assisted knee replacement apparatus and method |
| US20060001654A1 (en) * | 2004-06-30 | 2006-01-05 | National Semiconductor Corporation | Apparatus and method for performing data entry with light based touch screen displays |
| US20060001656A1 (en) * | 2004-07-02 | 2006-01-05 | Laviola Joseph J Jr | Electronic ink system |
| US8560972B2 (en) * | 2004-08-10 | 2013-10-15 | Microsoft Corporation | Surface UI for gesture-based interaction |
| US20060129417A1 (en) * | 2004-12-14 | 2006-06-15 | Design Logic, Inc. | Systems and methods for logo design |
| US8398541B2 (en) * | 2006-06-06 | 2013-03-19 | Intuitive Surgical Operations, Inc. | Interactive user interfaces for robotic minimally invasive surgical systems |
| US9640017B2 (en) * | 2005-08-31 | 2017-05-02 | Igt | Gaming system and method employing rankings of outcomes from multiple gaming machines to determine awards |
| US20070073133A1 (en) * | 2005-09-15 | 2007-03-29 | Schoenefeld Ryan J | Virtual mouse for use in surgical navigation |
| US8167805B2 (en) * | 2005-10-20 | 2012-05-01 | Kona Medical, Inc. | Systems and methods for ultrasound applicator station keeping |
| US20070120763A1 (en) * | 2005-11-23 | 2007-05-31 | Lode De Paepe | Display system for viewing multiple video signals |
| US7558622B2 (en) * | 2006-05-24 | 2009-07-07 | Bao Tran | Mesh network stroke monitoring appliance |
| US8811692B2 (en) * | 2007-04-17 | 2014-08-19 | Francine J. Prokoski | System and method for using three dimensional infrared imaging for libraries of standardized medical imagery |
| US8166421B2 (en) * | 2008-01-14 | 2012-04-24 | Primesense Ltd. | Three-dimensional user interface |
| US8054116B2 (en) * | 2008-01-23 | 2011-11-08 | Qualcomm Incorporated | Threshold dithering for time-to-digital converters |
| US8155479B2 (en) * | 2008-03-28 | 2012-04-10 | Intuitive Surgical Operations Inc. | Automated panning and digital zooming for robotic surgical systems |
- 2007-07-20: EP application EP07014276A filed; published as EP2017756A1 (status: ceased)
- 2008-07-18: US application US12/176,027 filed; published as US20090021475A1 (status: abandoned)
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2012025159A1 (en) | 2010-08-27 | 2012-03-01 | Brainlab Ag | Multiple-layer pointing position determination on a medical display |
| CN103153171A (en) * | 2010-08-30 | 2013-06-12 | 富士胶片株式会社 | Medical information display device, method and program |
| DE102014107966A1 (en) * | 2014-06-05 | 2015-12-17 | Atlas Elektronik Gmbh | Screen, sonar and watercraft |
| CN104049639A (en) * | 2014-06-24 | 2014-09-17 | 上海大学 | Unmanned surface vehicle anti-surge control device and method based on support vector regression |
| CN104049639B (en) * | 2014-06-24 | 2016-12-07 | 上海大学 | A kind of unmanned boat antisurge based on support vector regression controls apparatus and method |
| US9928570B2 (en) | 2014-10-01 | 2018-03-27 | Calgary Scientific Inc. | Method and apparatus for precision measurements on a touch screen |
| US20230293248A1 (en) * | 2016-09-27 | 2023-09-21 | Brainlab Ag | Efficient positioning of a mechatronic arm |
| US12114944B2 (en) * | 2016-09-27 | 2024-10-15 | Brainlab Ag | Efficient positioning of a mechatronic arm |
Also Published As
| Publication number | Publication date |
|---|---|
| US20090021475A1 (en) | 2009-01-22 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP2017756A1 (en) | Method for displaying and/or processing or manipulating image data for medical purposes with gesture recognition | |
| DE102009032637B4 (en) | Image magnification system for a computer interface | |
| DE69233600T2 (en) | Device for manipulating an object displayed on a screen | |
| DE3787827T2 (en) | Command entry system for an electronic computer. | |
| DE4406668C2 (en) | Method and device for operating a touch-sensitive display device | |
| DE69026647T2 (en) | Zoom mode modes in a display device | |
| EP3400515A1 (en) | User interface comprising a plurality of display units, and method for positioning contents on a plurality of display units | |
| US8402386B2 (en) | Method and apparatus for two-dimensional scrolling in a graphical display window | |
| DE102013007250A1 (en) | Procedure for gesture control | |
| EP3903172A1 (en) | Method and arrangement for outputting a head-up display on a head-mounted display | |
| EP2795451B1 (en) | Method for operating a multi-touch-capable display and device having a multi-touch-capable display | |
| DE69221204T2 (en) | Data processing device for window position control | |
| DE102013203918A1 (en) | A method of operating a device in a sterile environment | |
| DE69026516T2 (en) | DIGITIZED TABLET WITH TWO-WAY RUNNER / MOUSE | |
| US5526018A (en) | Stretching scales for computer documents or drawings | |
| DE202017105674U1 (en) | Control a window using a touch-sensitive edge | |
| DE102016204692A1 (en) | Control of multiple selection on touch-sensitive surfaces | |
| DE102012203163A1 (en) | Apparatus and method for exchanging information between at least one operator and one machine | |
| DE112019002798T5 (en) | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM | |
| EP1308832A2 (en) | Electronic device | |
| DE102009003995A1 (en) | Method for enlarging a display area on a presentation device | |
| EP1019800B1 (en) | System for capturing and processing user entries | |
| EP1881398B1 (en) | Method for positioning a cursor on a touch-sensitive screen | |
| WO2006032442A1 (en) | Control device for displays | |
| EP3159785A2 (en) | User interface and method for interactive selection of a display |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| | 17P | Request for examination filed | Effective date: 20070720 |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR |
| | AX | Request for extension of the European patent | Extension state: AL BA HR MK RS |
| | AKX | Designation fees paid | Designated state(s): DE FR GB |
| | 17Q | First examination report despatched | Effective date: 20090916 |
| | RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: BRAINLAB AG |
| | STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| | 18R | Application refused | Effective date: 20160430 |